Revision as of 16:07, 20 May 2010

Title

MIDI Capture

Description

  • Motivation

I have a strong interest in music and in manipulating sound. I wanted to do a project that builds on those interests and combines them with something interactive, using the video capture methods in Processing.

  • Interactive paradigm

This program uses brightness tracking from Processing's Video library to track the brightest pixel in the camera image. Using a flashlight, the user controls the frequency and amplitude of piano-synthesized MIDI notes. Simultaneously, a predetermined array of notes plays at random and draws random squares on the screen. This creates an improvisational live "duet" between the participant and the program.
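
A minimal sketch of this kind of brightest-pixel tracking, assuming the current Processing Video library (the original 2010 sketch would have used the older Capture constructor); the 640x480 size and the variable names are illustrative, not the original code:

 import processing.video.*;

 Capture video;
 int brightestX = 0;   // location of the brightest pixel in the last frame
 int brightestY = 0;

 void setup() {
   size(640, 480);
   video = new Capture(this, width, height);
   video.start();
 }

 void draw() {
   if (video.available()) {
     video.read();
   }
   image(video, 0, 0);

   // Scan every pixel and remember the brightest one (e.g. the flashlight).
   video.loadPixels();
   float brightestValue = 0;
   for (int y = 0; y < video.height; y++) {
     for (int x = 0; x < video.width; x++) {
       color c = video.pixels[y * video.width + x];
       if (brightness(c) > brightestValue) {
         brightestValue = brightness(c);
         brightestX = x;
         brightestY = y;
       }
     }
   }

   // Mark the tracked point so the user can see what the program is following.
   noFill();
   stroke(255, 0, 0);
   ellipse(brightestX, brightestY, 20, 20);
 }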

  • Technical Description

The flashlight held by the user is tracked with the brightness tracking method in Processing's Video library. The top half of the screen is the area where the user controls the MIDI notes: moving left to right controls the amplitude, while moving up and down controls the frequency. The bottom half of the screen is reserved for the random squares, which create a sort of rhythmic dance on the screen while playing an improvised melody for the user to interact with. I used the Soundcipher library to control the MIDI sounds.
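
A sketch of how that top-half mapping could be written with SoundCipher's playNote(pitch, dynamic, duration) call; the playFromFlashlight helper, the pitch range (MIDI 48-84), and the loudness range are assumptions for illustration rather than the original code, and the mouse stands in here for the tracked flashlight:

 import arb.soundcipher.*;

 SoundCipher sc;

 void setup() {
   size(640, 480);
   sc = new SoundCipher(this);   // default General MIDI program 0: acoustic grand piano
 }

 void draw() {
   // For a quick test the mouse stands in for the flashlight; in the full
   // sketch, pass in the brightest-pixel coordinates from the tracking step.
   if (frameCount % 15 == 0) {
     playFromFlashlight(mouseX, mouseY);
   }
 }

 // Map a point in the top half of the screen to a piano note:
 // left/right sets the loudness, up/down sets the pitch.
 void playFromFlashlight(int x, int y) {
   if (y < height / 2) {
     float dynamic = map(x, 0, width, 20, 127);       // quiet on the left, loud on the right
     float pitch   = map(y, height / 2, 0, 48, 84);   // higher on screen -> higher note
     sc.playNote(pitch, dynamic, 0.5);                // pitch, loudness, duration in beats
   }
 }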

The flow of information is as follows: as the user moves the flashlight across the screen, the MIDI note's parameters (amplitude and frequency) change, while the random squares play a melody on the bottom half of the screen. The end result is an interactive MIDI sound dialogue between the user and the program.
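
A sketch of the program's half of the duet under the same assumptions; the particular note array (a C major pentatonic set) and the half-second timing are illustrative choices rather than the original values:

 import arb.soundcipher.*;

 SoundCipher sc;
 // A predetermined set of pitches for the program's side of the duet.
 float[] melody = {60, 62, 64, 67, 69, 72};

 void setup() {
   size(640, 480);
   sc = new SoundCipher(this);
 }

 void draw() {
   // Roughly twice a second, pick a note at random, play it, and mark it
   // with a random square in the bottom half of the screen. The squares
   // accumulate because the background is never cleared.
   if (frameCount % 30 == 0) {
     float pitch = melody[int(random(melody.length))];
     sc.playNote(pitch, 80, 0.5);

     float s = random(10, 60);
     fill(random(255), random(255), random(255));
     rect(random(width - s), random(height / 2, height - s), s, s);
   }
 }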

Visualization

  • Functional Diagrams

Flow Diagram: Flowdiagram.jpg


Sample Screenshots (what the user will see): Screenshots.jpg