My final project will be a music visualization system for live audio, tailored to small live bands. The system will consist of software that drives the visuals and hardware that lets the musicians interact with them. The visuals will be projected and will show a history of the sonic environment: not a constantly changing display, but a somewhat cohesive, evolving form that is affected by the music over time. The visuals also pass through a controller that lets the projectionist fine-tune them as they happen.
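As a rough illustration of the "history of the sonic environment" idea, here is a minimal Python sketch (not the actual Max/MSP patch) that keeps a smoothed, rolling history of loudness values, so that visuals could evolve with the recent sound rather than jump with every frame. The frame rate, history length, and smoothing factor are all assumed values for the example.

```python
from collections import deque
import numpy as np

# Hypothetical sketch: maintain a slowly-evolving history of loudness (RMS)
# so visuals respond to the recent sonic environment, not each instant.
HISTORY_LEN = 120   # e.g. ~5 s of history at ~24 frames/s (assumed rate)
SMOOTHING = 0.9     # higher = slower-moving, more cohesive visual form

def rms(frame: np.ndarray) -> float:
    """Root-mean-square loudness of one audio frame."""
    return float(np.sqrt(np.mean(frame ** 2)))

class SonicHistory:
    def __init__(self):
        self.history = deque(maxlen=HISTORY_LEN)
        self.smoothed = 0.0

    def push(self, frame: np.ndarray) -> float:
        # Exponential smoothing keeps the form evolving rather than jittering.
        self.smoothed = SMOOTHING * self.smoothed + (1 - SMOOTHING) * rms(frame)
        self.history.append(self.smoothed)
        return self.smoothed

# Usage with synthetic frames standing in for live audio input:
hist = SonicHistory()
rng = np.random.default_rng(0)
for _ in range(200):
    hist.push(rng.normal(0.0, 0.5, size=512))
print(len(hist.history))  # capped at HISTORY_LEN
```

A renderer could then draw the whole `history` buffer at once, which is what gives the visuals their memory of the last several seconds of sound.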
In addition to visualizing the audio environment, I'd also like to pair this with a sensing element that the performer can wear. This will be a motion sensor or something similar that tracks movement and allows the performer to interact with the visuals. I'm thinking of this as an open-source project that will be well documented so that other people can use it and adapt it for their own audio projects.
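One plausible way the wearable sensor could feed the visuals is to collapse a 3-axis accelerometer reading into a single normalized "energy" parameter. This is only a sketch under assumptions, since the hardware is still undecided: the units (g), the rest threshold, and the saturation point are all placeholders.

```python
import math

# Hypothetical sketch: map a wearable 3-axis accelerometer reading to a
# 0-1 "energy" parameter that the visual system could consume.
REST_G = 1.0      # gravity contributes ~1 g when the performer is still
MAX_DELTA = 2.0   # motion beyond this deviation (in g) saturates the output

def motion_energy(ax: float, ay: float, az: float) -> float:
    """Deviation of acceleration magnitude from rest, normalized to [0, 1]."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    delta = abs(magnitude - REST_G)
    return min(delta / MAX_DELTA, 1.0)

# A still performer yields ~0; a sharp gesture pushes toward 1.
print(motion_energy(0.0, 0.0, 1.0))   # at rest -> 0.0
print(motion_energy(1.5, -1.2, 2.0))  # energetic motion -> larger value
```

The same shape of mapping would work whether the readings arrive from a Wii Remote or from an Arduino over Bluetooth, which keeps the two hardware options interchangeable on the software side.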
I plan to begin with the software component, building on an audio analyzer that I developed in Max/MSP last semester. For the hardware component, I'm still investigating whether to use something pre-existing like a Wii Remote or to work with a wireless/Bluetooth-enabled Arduino. I also need to do much more research into current and past music visualizations in order to situate the project in the appropriate context.