Midterm Project

From Robert-Depot
Revision as of 15:24, 13 May 2010 by Jrull (talk | contribs) (Documentation)




  • Motivation

We were both really inspired by the YouTube video shown in class of the zen master who was able to control music tracks with only his arm and leg motions. Although our interests usually lie outside of this medium, we are very interested in getting Processing to do something similar to what we saw. Because our interests usually lie in the traditional construction of art, our preliminary idea is to make a Processing project based upon the mini projects we made last week.

  • Interactive Paradigm

Using the webcam, we will track the person via the brightest pixel and make an interactive animation using the vector animations already provided with Processing. This will be more of a game-like exploration of the different arenas within the screen. Objects will be set up through Processing, not present in the real world but existing only within the Processing window. Depending on where the tracked pixel goes, it will demolish whatever object we have placed there.
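The brightest-pixel tracking could be sketched as a scan over the frame's pixel array. This is a minimal sketch in plain Java, assuming a Processing-style `pixels[]` array of packed ARGB ints and a known frame width; the class and method names are illustrative, not from our actual sketch.

```java
// Minimal sketch of brightest-pixel tracking over a webcam frame,
// assuming pixels[] holds packed ARGB ints in row-major order.
public class BrightestPixel {
    // Returns {x, y} of the brightest pixel, using the sum of the
    // red, green, and blue channels as a simple brightness measure.
    public static int[] find(int[] pixels, int width) {
        int bestIndex = 0;
        int bestBrightness = -1;
        for (int i = 0; i < pixels.length; i++) {
            int r = (pixels[i] >> 16) & 0xFF;
            int g = (pixels[i] >> 8) & 0xFF;
            int b = pixels[i] & 0xFF;
            int brightness = r + g + b;
            if (brightness > bestBrightness) {
                bestBrightness = brightness;
                bestIndex = i;
            }
        }
        // Convert the flat index back to screen coordinates.
        return new int[] { bestIndex % width, bestIndex / width };
    }
}
```

In a real Processing sketch the same loop would run over `cam.pixels` each frame, and the returned coordinates would drive the animation.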

We have been thinking about different themes. For example, musical notes could play a track when touched and demolished, or some sort of narrative could be built into these objects.

(1) Dragon.

The tracked pixel has to touch a sword in the background. Once activated, the sword matches the movements of the pixel. A dragon appears and moves around the screen like a ball. Your job is to stab it with the pixel, but you have to move around in the screen to reach it. Once it makes contact, maybe a big red splotch (another .jpg file) will splash onto the screen.

(2) Building

There will be blocks strewn all around the room. Once the brightest pixel touches a block, the block moves with the pixel. Blocks can be stacked on top of each other... maybe building a castle. Maybe an interactive Tetris.

(3) Pesky Fly

There will be a fly bouncing fast around the screen. You are the flyswatter.

(4) Simon Says

The user does whatever the screen commands them to do.

  • Technical Description

Using the code we each developed in Processing, we want to construct an interactive project driven by what the webcam sees. The screen will be divided into regions of space; depending on where the user creates movement, detected via the brightest pixel, the different regions will activate and produce a sound presentation.
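The region activation described above amounts to mapping the tracked pixel's position to a region index. A minimal sketch, assuming the screen is split into an evenly sized grid (the grid dimensions and names here are illustrative assumptions, not our actual layout):

```java
// Sketch of region lookup: which cell of a cols-by-rows grid does
// the tracked pixel fall in? Each cell could trigger its own sound.
public class Regions {
    // Regions are numbered left-to-right, top-to-bottom starting at 0.
    public static int regionAt(int x, int y, int screenW, int screenH,
                               int cols, int rows) {
        int col = Math.min(x * cols / screenW, cols - 1);
        int row = Math.min(y * rows / screenH, rows - 1);
        return row * cols + col;
    }
}
```

Each frame, the sketch would look up the region under the brightest pixel and, on a region change, start that region's sound.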


  • Functional Diagrams


  • Visual Concept

Dragon.jpg Fly.jpg Bricks.jpg Simonsays.jpg



We have constructed a game in which YOU, the user, are controlled by the computer. What gets ridiculous is that as the game goes on, the computer gets more and more... playful. Soon you will be leaping and jumping and diving to do what it asks of you. The idea is simple: touch the square with the particle beam and you move on. Sadly, the square jumps from location to location after a short amount of time. How long, you ask? Only we know, and you will have to find out. Soon the user discovers that the time keeps decreasing and decreasing, spawning erratic movements from the player. Once you hit five (5) squares, the game takes a screenshot of you and saves it in a folder to be used otherwise.
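The bookkeeping behind the game above can be sketched as a hit test against the square plus a hold time that shrinks with every hit. This is a hypothetical sketch in plain Java; the field names, shrink rate, and minimum hold time are illustrative guesses, not the values in Messy2.pde.

```java
// Hypothetical sketch of the game state: a hit test against the
// moving square, and a per-hit countdown that keeps shrinking.
public class SquareGame {
    int squareX, squareY;
    int squareSize = 60;        // illustrative size in pixels
    int holdMillis = 3000;      // how long the square stays put
    int hits = 0;

    // True when the tracked bright pixel lands inside the square.
    boolean hitTest(int px, int py) {
        return px >= squareX && px < squareX + squareSize
            && py >= squareY && py < squareY + squareSize;
    }

    // Called on each successful hit: shrink the hold time (clamped
    // to a floor) and report when the screenshot is due, i.e. after
    // the fifth hit as described above. Relocating the square would
    // happen elsewhere in the draw loop.
    boolean registerHit() {
        hits++;
        holdMillis = Math.max(500, holdMillis * 3 / 4);
        return hits >= 5;
    }
}
```

In a Processing sketch, the draw loop would also relocate the square whenever `holdMillis` elapses without a hit, which is what produces the increasingly erratic movement.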


We can't help but notice that in this game, you are not the one manipulating the environment; rather, the technology tells you what to do in the real world. In almost a flip of the virtual roles people play in games, we are controlled in the real world by the virtual squares that appear on the screen. And because we only save screenshots of the most ridiculous situations from the games we play, we have the camera set to capture only the most frustrated attempts at completing the game. Appearance matters little to the randomly moving, transparent blue square. Media:Messy2.pde