Tag Archives: MaxMSP

Global Perspective Development

Initial Project Pitch:

For my Global Perspective project, I have really enjoyed exploring game systems within improvised music, so for my final project I aim to develop an improvised battle system where two people can create music together using game controllers. The idea of the project is to put you into a gaming mindset whilst still creating music. I think the project will also benefit gamers who aren’t necessarily musicians, allowing them to create music.

I plan to develop a MaxMSP patch to convert an Xbox 360 games controller into a MIDI device. I am hoping to get the most out of the controller by utilizing as many buttons as possible, giving the player a greater range of sounds and possibilities.

I am hoping to create a system that is easy enough for people to simply pick up and play, and that allows for quick, fun improvisation games.

______________________________________________________

The Development:

Within another module this year, I had already begun developing a MaxMSP patch that converts game controllers into MIDI devices. So by simply expanding upon that development, I was able to get a fully functional patch.

Here are a few images of my patch at several stages of development:

After I was happy that the controllers were working correctly and I had thoroughly tested them several times, I started to MIDI map within Ableton. I colour coded the samples I used to match the colours of the controller.

Ableton Colour Code

I used the Start and Back buttons on the controller to trigger percussion, and the Y, B, A and X buttons to trigger samples and sounds. I also MIDI mapped the right (RB) and left (LB) shoulder buttons: one to loop any of the samples or sounds, and the other to bypass or turn on distortion. I mapped one analogue stick to change pitch/transpose and the other to control the amount of distortion. I repeated this process for the second controller, as I knew it would be important to keep the controls and functions the same for each controller.
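The mapping above can be sketched as a simple lookup from controller events to raw MIDI messages. This is only an illustration: the button-to-note numbers and CC number below are assumptions, not the actual values used in the Ableton session.

```python
# Hypothetical sketch of the controller-to-MIDI mapping described above.
# The note and CC numbers are illustrative assumptions.

BUTTON_NOTES = {
    "start": 36,  # percussion
    "back":  38,  # percussion
    "Y": 60, "B": 62, "A": 64, "X": 65,  # samples and sounds
    "RB": 70,  # toggle looping
    "LB": 71,  # toggle distortion bypass
}

def button_to_midi(name, pressed, channel=0):
    """Translate a button event into a raw MIDI note-on/note-off message."""
    note = BUTTON_NOTES[name]
    status = (0x90 if pressed else 0x80) | channel
    velocity = 127 if pressed else 0
    return (status, note, velocity)

def stick_to_cc(value, cc, channel=0):
    """Map an analogue stick axis (-1.0..1.0) to a MIDI CC value (0..127),
    e.g. pitch/transpose on one stick, distortion amount on the other."""
    amount = round((value + 1.0) / 2.0 * 127)
    return (0xB0 | channel, cc, amount)
```

In Ableton, messages like these are what MIDI-learn listens for when mapping a control to a device parameter.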

Midi

Midi Mapping

During development, I decided that one of the controllers would trigger bassier sounds than the other, so I selected higher-pitched sounds for the second. After I had found a variety of sounds for each of the controllers, I passed it on to a friend to test before finally testing with the class. The feedback from both trial runs was along the same lines: both agreed it would work a lot better if the controls diagram were left on the screen to refer to at all times. So I took this feedback on board and put the control diagram on the screen for my final performance.

Controller Controls

The Game Controls

Another thing I took into account from the test runs was that improvisation can go on for a long time. So, keeping with the theme of making it a game system, I took several space-style videos and cut them all down to two minutes in length, creating levels. In my final performance players can now choose from five selectable levels to improvise to. I felt the time restriction not only allowed for a game-style challenge, but also made players think on their feet. It also allowed more people to get involved.

Lvl

The Five Selectable Levels

 

After several trial runs, I also decided to pan the output of each controller, creating left versus right. This makes it clearer not only for the players to be aware of what they are individually creating, but also for viewers to hear the difference between what the two players are doing.

v-Games Controller Video Development Diary-v
(The Testing Process)

For my set-up, I plan to have two laptops, one on the left and the other on the right. This correlates with the panned controllers (left on the left, right on the right). The two laptops will display a still image of the game controls. In between the two laptops I also plan to use a projector to project the selected level. The two-minute video will hopefully put the two players into a certain mindset/zone.

______________________________________________________

Conclusion:

Overall I am rather happy with the outcome of my project. I feel I achieved what I set out to, and it developed quite steadily, which brought it all together naturally. I created a rather simple, straightforward improvised game system, which is a nice middle ground between being a video game and creating music. If I were to do it again, I’d give more of an explanation, and maybe a demonstration, of the game system on the day, so people could get a better idea of what they were doing. I would also tighten up the videos/levels by adding an on-screen timer and a clear ending to each video, so people knew when the round was over. If I were to take this project further, I would spend time tightening it up and making those few changes I mentioned.

v-Final Performance-v

Research:

List Of Music Games

Looks Like a Game Controller, Plays Like a Chip Instrument

Cannon Fodder theme played on Game Controllers

Playing music with the Steam Controller Haptic Actuators : Still Alive

MUSICAL IMPROV GAMES


10th November – 25th Of January ~ Final Project.

The idea & Project pitch:

For my final Emerging Technologies project, I want to create and develop a live performance tool for myself that I can use as a unique playable instrument at a live show. I aim to take a Guitar Hero controller and convert it into a MIDI controller device. I am hoping to open up the case of the guitar and use an Arduino/Teensy inside.

My initial idea is to use the buttons already found on the Guitar Hero controller to trigger sounds. I also want to add a rotary dial somewhere on the body of the guitar, possibly to change the pitch or a filter. I also want to run metal wire through the frets of the guitar to give a larger range of sounds. Lastly, I want to convert the two buttons found near the bottom of the body into drum samples, so that by pressing them you can create a beat using a kick drum and a snare.

v-Initial Idea Diagram-v

First template

______________________________________________________

Guitar Hero Development Process:

This picture shows the guitar with no alterations or modifications made; this is the foundation and the device I am going to be working with. I first looked into whether the Guitar Hero controller was USB, since that would open the alternative possibility of creating a MaxMSP patch and converting it into a MIDI device from there. After I was unable to achieve this, I set out on my initial idea of modifying the guitar and running it through a Teensy device.

1

Original Guitar Hero Case

I first opened up the neck of the Guitar Hero controller to expose the circuitry inside. I then removed the fixture of the board and all of the wires found within, hollowing it out.

I then realised that the board attached to the inside of the neck held the buttons in place and was also how the buttons worked. It was difficult to figure out how I was going to make the buttons work with the Teensy. After a long process of experimentation, I really couldn’t achieve this, so I set out to think of other possible ways I could still reach my goal. After a little while, I came up with the idea of covering the buttons in a conductive metal material, then running wires from them to the touch-sensitive pins on the Teensy. This worked and gave me the effect and functions I was after. I then tested to ensure the method worked.
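The detection side of this technique is essentially a baseline-plus-margin comparison: a touch pin's raw reading jumps when a finger contacts the conductive surface. On the Teensy itself this would be Arduino C using `touchRead()`; the Python sketch below just illustrates the logic, and the margin value is an assumption.

```python
# Sketch of the touch-detection logic enabled by wiring the
# metal-covered buttons to the Teensy's touch-sensitive pins.
# MARGIN is an illustrative assumption, not a measured value.

MARGIN = 200  # how far above the resting reading counts as a touch

def calibrate(samples):
    """Average a few untouched readings to get a per-button baseline."""
    return sum(samples) / len(samples)

def is_touched(reading, baseline, margin=MARGIN):
    """A button covered in conductive material reads noticeably higher
    when a finger contacts it; treat anything over baseline + margin
    as a press."""
    return reading > baseline + margin
```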

Once I was happy and confident the buttons were functioning correctly, I moved on to the touchable frets. By drilling small holes into the neck of the guitar, I was able to push through thin wire, which I positioned and pulled tight against the fret ridges of the design.

I then moved on to removing the strumming mechanism of the Guitar Hero controller; I did this because I wanted to be able to see the Teensy and turn it on and off directly. After opening up the body, I placed the Teensy inside and taped it down from behind, holding it firmly in place. The positioning of the Teensy also allowed easy access for attaching the wires running down from the neck of the guitar. By also removing the whammy bar, I was able to run the wire from the Teensy out through the empty hole where the whammy bar used to be.

Once I was happy with all the alterations I had made, I tested the overall product several times and also got a friend to test it. I realised that with no dials or changeable effects the guitar didn’t quite have the overall impact I was looking for; I knew it needed something else, as it was possible to do a lot more with it.

v-Testing Video Diary-v

______________________________________________________

Midi Xbox 360 Controller Process:

As I had really enjoyed creating and developing the games controller using MaxMSP, and it was something I wanted to take further, I came up with the idea of developing the controller into a MIDI device that would control and change the Guitar Hero sound, giving it a lot more possibilities and functionality.

I began by taking my previously developed patch and expanding upon it. I looked into using MaxMSP as a MIDI device, then deleted the sound-triggering functions from my previous patch and replaced them with MIDI send functions within Max.

Once I got the Xbox 360 controller fully functioning, I started to MIDI map within Ableton. I colour coded the samples I used to match the colours of the Xbox controller. I used the Start and Back buttons on the controller to trigger percussion, and the Y, B, A and X buttons to trigger samples and sounds. I also MIDI mapped the right (RB) and left (LB) shoulder buttons: one to loop any of the samples or sounds, and the other to bypass or turn on distortion. I mapped one analogue stick to change pitch/transpose and the other to control the amount of distortion. This works alongside the playing of the guitar, allowing you to change the pitch and the level of distortion of each note you play.
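The way the two devices combine can be sketched as a small piece of shared state: the Xbox sticks set a global transpose and distortion amount, which then colour every note played on the Guitar Hero frets. The fret-to-note table and the ±12 semitone range below are illustrative assumptions.

```python
# Sketch of the controller working alongside the guitar.
# FRET_NOTES and the transpose range are illustrative assumptions.

FRET_NOTES = {"green": 48, "red": 50, "yellow": 52, "blue": 53, "orange": 55}

class PerformanceState:
    def __init__(self):
        self.transpose = 0      # semitones, set by one analogue stick
        self.distortion = 0.0   # 0.0..1.0, set by the other stick

    def set_transpose(self, stick_value):
        # map the stick's -1.0..1.0 range to -12..+12 semitones
        self.transpose = round(stick_value * 12)

    def note_for(self, fret):
        # the note actually sent when a fret is touched
        return FRET_NOTES[fret] + self.transpose
```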

This

The Final Product.

Final Product Testing:

______________________________________________________

Research:

Guitar Hero:

DIY Arduino Ribbon Synth

Arduino Guitar

Servo Bender

Arduino MaxMSP Guitar

Electric Guitar, Arduino, and Max/MSP/Jitter

Dubstep Guitar Demo by Mukatu

Music with Guitar Hero Controller

Controller: 

Connecting a Joystick to MaxMSP/Jitter

Xbox 360 Controller MaxMSP Video

Max MSP Sampler/Looper Wireless Interface Patch

XBox 360 OSC Controller

MAXMSP – MIDI Tutorial 1: Basic MIDI

______________________________________________________

Conclusion:

I feel my project overall turned out rather well, and I personally feel I achieved what I set out to at the start. The development really challenged me to think of ways to overcome obstacles without losing sight of what I wanted to achieve. I feel I managed to scale a rather complicated idea down to my own technological level, so I count this as a personal success and am happy with the outcome. If I were to do this again, I think I would focus more on building upon the Teensy and integrating the Xbox controller within the Guitar Hero body itself. Overall I wasn’t too pleased with the final performance: I mistakenly brought an old template rather than the latest Ableton file. Because of this, it felt more like a short demonstration than a performance. So, if I were to do it again, I’d make sure I was more organised and would practise more with the instrument in an improvised situation.


27th – 3rd Of November ~ Assignment #2: MaxMSP Patch

Slide one – The Brief: Using the tools and techniques covered in sessions 4 & 5, create your own piece of software using MaxMSP that takes an external input, such as a camera feed, video, game controller or smartphone, to control sound in some way. We can use any of the technologies covered in class. The aim of the assignment is to think about who will use the work, what it is being used for, where it is being used, how, and why. It is in this sense that you can start to interrogate the needs, approach, benefits and competition of the application.

Slide two – The Idea: Using some of the techniques we covered in previous sessions of Emerging Technologies, I’ve decided to create and develop a MaxMSP patch that will allow a Microsoft games controller to trigger audio playback.

This item would be for personal live use. By self-branding this gimmicky item and making it a unique feature of a live stage show, you would bring something different to your viewers, making you stand out from the crowd. Hypothetically, after a while you could go on to brand and sell the item as a product.

Slide three – The Hypothetical Controls: This is the first draft of the controls I set out to achieve. I really wanted to get as much out of the controller as possible.

Controls

Slide four – The MaxMSP Patch: The Max patch took a lot of work and figuring out. I started off by getting MaxMSP not only to identify the controller, but to read all the various numbers coming in from it. I then moved on to separating those numbers and identifying which parts of the controller they represented. I found a great online tutorial on recreating the Xbox 360 controller, which I followed; it gave me a nice, clear layout of the controller in Max. Once I was happy that I had identified the components of the controller, I went on to assign sounds to the controls, based on the method we covered in class.
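The separating step can be pictured as routing the raw (element id, value) pairs the controller reports to named controls, which in Max is the job of the `hi` object feeding `route`/`unpack`. The element ids below are illustrative assumptions; real ids depend on the controller and driver.

```python
# Hypothetical sketch of separating raw controller reports into named
# controls. The element ids here are assumptions for illustration only.

ELEMENT_NAMES = {
    1: "A", 2: "B", 3: "X", 4: "Y",
    5: "LB", 6: "RB",
    10: "left_stick_x", 11: "left_stick_y",
}

def route(element_id, value):
    """Turn a raw controller report into a (control name, value) pair,
    ignoring elements we haven't mapped."""
    name = ELEMENT_NAMES.get(element_id)
    return (name, value) if name else None
```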

Slide five – The Conclusion: The controller didn’t completely turn out the way I would have liked. Although most of the buttons worked fine and I got the controller making sounds, I would have preferred all of the controls to work and run a little more smoothly. I would personally like to take this project further and convert the controller into a MIDI device, which would allow it to trigger sounds directly from DAW software.

v The Project Pitch Presentation v

2nd Assignment Prezi

Research Links:

Connecting a Joystick to MaxMSP/Jitter

Xbox 360 Controller MaxMSP Video

Max MSP Sampler/Looper Wireless Interface Patch

XBox 360 OSC Controller


27th Of October – #3.MaxMSP Recap

Looping Patch

Today we looked into developing a patch that not only looped audio but also allowed you to manipulate a saw waveform. The phasor~ in the patch is the saw wave, which is split into two separate signal paths, each containing a *~ object; in MaxMSP this is a signal multiplier that outputs the multiplication of two signals. The next object is play~ with a loop, which is MaxMSP’s way of reading and playing back the loop files found on the computer. This then goes into gain~, and the gain sliders allow you to adjust each of the levels separately, to mix the two sounds or bring one in and out. Finally, the gain is wired up to ezdac~, which appears as a button that can be clicked with the mouse to turn audio on or off.
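The two core signal objects can be sketched numerically. This is a minimal illustration, assuming a tiny sample rate for readability: phasor~ ramps repeatedly from 0 to 1 at the given frequency, and *~ multiplies two signals sample by sample.

```python
# Minimal numeric sketch of Max's phasor~ and *~ objects.

def phasor(freq, sample_rate, n_samples):
    """Generate n_samples of a 0..1 sawtooth ramp, like Max's phasor~."""
    step = freq / sample_rate
    return [(i * step) % 1.0 for i in range(n_samples)]

def multiply(signal_a, signal_b):
    """Sample-wise multiplication of two signals, like Max's *~."""
    return [a * b for a, b in zip(signal_a, signal_b)]
```

Multiplying the ramp against the playing loop is what lets the saw wave shape the loop’s amplitude.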

loop patch


23rd Of October – #2.MaxMSP Follow Up

My thoughts on – Joseph Paradiso 1998 analysis of Electronic Music Interfaces

Reading Joseph Paradiso’s analysis, and knowing it was written in 1998, you can tell he predicted, or at least had an idea of, where music and especially electronic interfaces were heading. I found myself agreeing with his view that electronic music, in contrast, has no such legacy compared to classic acoustic instruments. He also noted that the field has only existed for under a century, giving electronic instruments far less time to mature, and I feel that is still true even now.

He also went on to discuss the rapid and constant change in electronics, including the conversion to digital, and predicted how computers would play a major part in producing modern music with no additional hardware; this is undeniably true by today’s standards. Finally, he foresaw the rise of performance away from conventional electronic interfaces, saying: “In the not-too-distant future, perhaps we can envision quality musical performances being given on the multiple sensor systems and active objects in our smart rooms, where, for instance, spilling the active coffee cup in Cambridge can truly bring the house down in Rio.” What we’ve covered in class alone shows this is now possible and easier than ever.

In conclusion, I agreed with Joseph Paradiso’s analysis and feel he really had a clear understanding of where music was heading; his predictions for the not-too-distant future are now truer than ever. I think the possibilities of music these days are almost endless, from producing to performing, and it is harder than ever to predict what is in store for music further in the future. But it is undeniable that classic acoustic instruments will always play a big part in music’s legacy.

_____________________________________________________

MaxMSP Research Links

A few further research links covering the methods we’ve used so far:

A remote-controlled visualization of the sound generated in Max/MSP: Sound Sync in MaxMSP

This project combines a complex feedback system and a self-organization system in an interactive way in order to generate an amazing and spacious sound. It visualizes the phase differences and transforms them into dramatic graphics, trying to control the unpredictability and diversity with endless, evolving feedback effects. People’s motion affects the sound and therefore changes the final visual image as well as the feedback effects: Audio-Visual Feedback System on Max/MSP

A great project of someone who used a Nintendo Wii controller to wirelessly modify their guitar effects: Wii Guitar Max/Msp


20th Of October – #1.MaxMSP

Audio Reactive Video

Within class we covered triggering and synchronizing visuals with sound. Using the Macintosh’s built-in microphone to pick up sound, we developed a patch that lip-synced a video when sound was detected. Using several object and message possibilities in Max, we picked a video and set the start and end points; these would be the first and last frames. The incoming sound would then trigger the motion of the video, flicking and changing the frame between the start and end points as the sound came in.

T (toggle) – An overall on/off switch.
ezadc~ – The object name for the audio input (microphone) interface.
gain~ – The level control for the microphone signal.
meter~ – A visual meter that shows the level of the signal in dB.
scale – Maps an input range onto an output range. The ranges can be specified with hi and lo reversed for inverted mapping. If specified, the mapping can also be exponential.
number – A number box used to display, input and output integer numbers. By setting its maximum attribute in the inspector to the last frame, the video can’t go past that point. Also, when audio is off, the number box displays a usable frame for a start point.
M (message) frame $1, bang – Sends a frame message with the incoming number, followed by a bang, telling jit.qt.movie to jump to that frame and output it.
jit.qt.movie – The jit.qt.movie object provides a full suite of services for QuickTime movies: playback, editing, import, export, effect generation and direct-to-video-output-component streaming.
M (message) read cat.mov – The message object displays and sends any given message, with the capability to handle specified arguments; in this case it tells jit.qt.movie to read/open the cat.mov file.
loadbang – The loadbang output is triggered automatically when the file is opened, or when the patch is part of another file that is opened.
jit.pwindow – The jit.pwindow object takes a Jitter matrix and displays its numerical values as a visual image in a window you can place in any patcher.

Talking Patch
T (toggle) > ezadc~ > gain~ > meter~ > scale > number > M (message) frame $1, bang > jit.qt.movie > M (message) read cat.mov > loadbang > jit.pwindow
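The pivotal step in the chain is scale, which maps the microphone level onto the movie’s frame range. A minimal sketch of its linear behaviour in Python, following Max’s argument order (low-in, high-in, low-out, high-out):

```python
# Sketch of Max's scale object (linear mode only).

def scale(x, lo_in, hi_in, lo_out, hi_out):
    """Linearly map x from [lo_in, hi_in] to [lo_out, hi_out].
    Reversing lo/hi on either side inverts the mapping, as in Max."""
    t = (x - lo_in) / (hi_in - lo_in)
    return lo_out + t * (hi_out - lo_out)
```

In the patch, this is what turns a 0..1 gain reading into a usable frame number for jit.qt.movie.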

v Screen Shot Of Final Working Patch v
Talking Patch (2)

_____________________________________________________

Basic Motion Detection. 

This patch, developed in class, covers the basics of motion detection. Using the built-in camera on the Macintosh, we triggered sound with a patch that read and translated any motion picked up by the webcam. The patch was set up so that a certain amount of motion would trigger one of four sound effects. As you can see in the patch, if the value was more than 0.05 then the sound changed: a small amount of movement triggered the first sound effect, and more motion picked up by the webcam changed which sound effect was active.
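The selection logic amounts to comparing the motion value against a ladder of thresholds. Only the 0.05 trigger value comes from the patch; the upper thresholds below are illustrative assumptions.

```python
# Sketch of the motion-to-sound-effect selection described above.
# Only 0.05 comes from the patch; the other thresholds are assumptions.

THRESHOLDS = [0.05, 0.2, 0.4, 0.6]  # ascending motion levels

def pick_effect(motion):
    """Return the index (0-3) of the sound effect for this motion level,
    or None when the motion is below the trigger threshold."""
    effect = None
    for i, t in enumerate(THRESHOLDS):
        if motion > t:
            effect = i
    return effect
```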

Chop patch


2nd Of October – Continuation Research

Bionicle robots, controlled by an Arduino Uno, which is hooked up to a MIDI sequencer – Arduino – Lego Band

Techno and electronic music played and controlled by light sensor using the Arduino – Arduino – Light Sensor Controller

This project combines a complex feedback system and a self-organization system in an interactive way in order to generate an amazing and spacious sound – MaxMSP – Audio Visual Feedback

A 3D-printed customizable music box playing the first three measures of “Frère Jacques” – 3D Printed Music Box

An incredible interactive sonic art exhibition using custom software with 3D tracking cameras – Custom – Barbican’s Rain Room

Custom made guitar hooked up to synths – Custom – Dubstep Guitar