
MidtermProject

== Description ==
*Interactive paradigm
This program uses the viewer's movements to control different sounds. I use brightness tracking with the Video library in Processing to track the brightest pixel in the video frame. Using a flashlight, the user controls the frequency and amplitude of piano-synthesized MIDI notes. Simultaneously, a predetermined array of notes plays at random and creates random squares that appear on the screen. This creates an improvisational live "duet" between the participant and the program.
*Technical Description
The flashlight held by the user is tracked with the brightness tracking method in Processing's video library. The top half of the screen is designated as the area where the user controls the MIDI notes: moving left to right controls the amplitude, while moving up and down controls the frequency. The bottom half of the screen is designated for the random squares, which create a sort of rhythmic dance on the screen while playing an improvised melody for the user to interact with. I used the SoundCipher library to control the MIDI sounds.
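Below is a minimal sketch of how this tracking and mapping could be implemented in Processing. It assumes the standard video library and the SoundCipher library; the mapping ranges, the note rate, and the use of General MIDI program 0 for the piano are illustrative choices, not the project's exact code.

<pre>
import processing.video.*;
import arb.soundcipher.*;

Capture video;
SoundCipher sc;

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();                 // current video-library API
  sc = new SoundCipher(this);
  sc.instrument(0);              // General MIDI program 0: acoustic grand piano (assumed choice)
}

void draw() {
  if (video.available()) {
    video.read();
    video.loadPixels();
    image(video, 0, 0);

    // Find the brightest pixel in the frame, assumed to be the flashlight.
    float brightest = 0;
    int bx = 0, by = 0;
    for (int y = 0; y < video.height; y++) {
      for (int x = 0; x < video.width; x++) {
        float b = brightness(video.pixels[y * video.width + x]);
        if (b > brightest) {
          brightest = b;
          bx = x;
          by = y;
        }
      }
    }

    // Only the top half of the screen controls the note parameters.
    if (by < height / 2 && frameCount % 15 == 0) {
      float loudness = map(bx, 0, width, 0, 127);       // left/right -> amplitude
      float pitch    = map(by, height / 2, 0, 48, 84);  // up/down -> frequency (higher is higher pitch)
      sc.playNote(pitch, loudness, 0.5);
    }

    // Mark the tracked flashlight position.
    noFill();
    stroke(255, 0, 0);
    ellipse(bx, by, 20, 20);
  }
}
</pre>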
The flow of information goes as follows: as the user moves the flashlight across the screen, it changes the MIDI notes' parameters (amplitude and frequency) while the random squares play a melody on the bottom half of the screen. The end result is an interactive MIDI sound dialogue between the user and the program.
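The bottom-half behaviour could be sketched along the following lines. The pitch array, the timing, and the square size are assumptions made for illustration.

<pre>
import arb.soundcipher.*;

SoundCipher melody;
// A predetermined array of pitches to choose from at random.
float[] notes = {60, 62, 64, 67, 69, 72};

void setup() {
  size(640, 480);
  melody = new SoundCipher(this);
  background(0);
}

void draw() {
  // Every 15 frames, play a random note from the array
  // and draw a matching square on the bottom half of the screen.
  if (frameCount % 15 == 0) {
    float pitch = notes[int(random(notes.length))];
    melody.playNote(pitch, random(60, 110), 0.5);

    fill(random(255), random(255), random(255));
    rect(random(width), random(height / 2, height), 30, 30);
  }
}
</pre>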
== Visualization ==