MidtermProject

Title

MIDI Capture

Description

  • Motivation

I have a strong interest in music and manipulating sound. I wanted to do a project that would spin off of those interests and combine them with something interactive involving video capture methods in Processing.

  • Interactive paradigm

It uses the viewer's movements to control different sounds in a song. I would like to use the frame differencing method and the way it quantifies the amount of movement in the video frame to control different parameters/filters of a song. Some of the control parameters are going to be the speed, treble, key, and volume. I'm hoping that this project will create an interaction between what viewers see on the screen and how their movement affects the song that is playing. It will give way to a unique experience with music that does not involve electrical controls, turntables, dance pads, or even an actual instrument. Rather, it gives your body some control over the sounds you hear through your own motions.

  • Technical Description

The project uses the frame differencing method to capture the amount of movement in a frame and then sends those numbers to Logic Pro to control different filters, which in turn change the sound of the song. The input is the amount of movement in the video frame and the output is a change in the sound of the song. I would split the frame into quadrants, each mapped to a different filter. A threshold amount of movement also has to be specified to control the amount of change. For example, the speed at which the participant moves in the quadrant that controls the amplitude of the song determines how loud or soft the song plays.
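Below is a minimal sketch of how this frame-differencing step could look in Processing, using the standard video library. The quadrant layout and all variable names are my own illustrative choices, not taken from the original project.

 // Minimal Processing sketch: frame differencing, summed per quadrant.
 // Quadrant layout and variable names are illustrative assumptions.
 import processing.video.*;
 
 Capture video;
 int[] prevFrame;                  // pixels from the previous frame
 float[] motion = new float[4];    // movement totals per quadrant
 
 void setup() {
   size(640, 480);
   video = new Capture(this, width, height);
   video.start();
   prevFrame = new int[width * height];
 }
 
 void captureEvent(Capture c) {
   c.read();
 }
 
 void draw() {
   video.loadPixels();
   for (int i = 0; i < 4; i++) motion[i] = 0;
 
   for (int y = 0; y < height; y++) {
     for (int x = 0; x < width; x++) {
       int loc = x + y * width;
       // Brightness difference between this frame and the previous one
       float diff = abs(brightness(video.pixels[loc]) - brightness(prevFrame[loc]));
       // Which quadrant does this pixel belong to? 0=top-left, 1=top-right, 2=bottom-left, 3=bottom-right
       int q = (x < width / 2 ? 0 : 1) + (y < height / 2 ? 0 : 2);
       motion[q] += diff;
       prevFrame[loc] = video.pixels[loc];
     }
   }
   image(video, 0, 0);
   // motion[0..3] now holds the per-quadrant movement amounts
 }

Each pass of draw() leaves the four movement totals in motion[0..3]; those are the numbers that would be scaled and sent on to Logic Pro.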

The flow of information will go as follows: a song begins to play and the user moves and interacts with the screen. Processing then takes the amount of movement in the designated quadrants of the screen and sends those numbers to Logic Pro. Logic Pro has an Environment that can process incoming numbers and use them as controls for different filters and parameters. The changes in the parameters are then sent to the audio speakers, and finally the user hears how their movements changed the sound of the song.
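The original page does not name the mechanism used to get those numbers out of Processing, but one common approach is to send them as MIDI control-change messages over a virtual MIDI port (such as the macOS IAC Driver) that Logic Pro listens to. The helper functions below sketch that step using The MidiBus library for Processing; the port name, MIDI channel, and controller numbers are assumptions for illustration, to be plugged into the sketch above.

 // Sketch of the Processing -> Logic Pro link, assuming The MidiBus library
 // and a virtual MIDI output port named "IAC Bus 1". Names, channel, and
 // CC numbers are illustrative, not taken from the original project.
 import themidibus.*;
 
 MidiBus midi;
 int channel = 0;
 // One controller number per quadrant, e.g. volume, speed, treble, key
 int[] ccNumbers = { 7, 20, 21, 22 };
 
 void setupMidi() {
   // -1 = no MIDI input; output goes to the virtual bus Logic Pro listens on
   midi = new MidiBus(this, -1, "IAC Bus 1");
 }
 
 void sendMotion(float[] motion, float maxMotion) {
   for (int q = 0; q < 4; q++) {
     // Scale each quadrant's movement total into the 0-127 MIDI range
     int value = (int) constrain(map(motion[q], 0, maxMotion, 0, 127), 0, 127);
     midi.sendControllerChange(channel, ccNumbers[q], value);
   }
 }

Inside Logic Pro, each incoming controller number can then be assigned to a filter, volume, or speed parameter, completing the path from body movement to audible change.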

Visualization

  • Functional Diagrams

Flow Diagram: Flowdiagram.jpg


Sample Screenshots (what the user will see): Screenshots.jpg