20 May, 2010
People Watching Plus at Come Out And Play
The game I did with a couple of friends, People Watching Plus, has been accepted
into the Come Out And
Play festival 2010. If you're in New York on June 5th, stop by the Old Stone
House at 1 pm and playtest a fun and not-so-competitive game. Read more about
the rules and location on the
Come
Out And Play festival's website.
05 May, 2010
Balloon Projections
This video shows a series of audio responsive animations projected onto a
cluster of balloons. The projections are going to be used as a part of the stage
show for an upcoming US tour. The balloons will be mounted onto an old barrel
organ that is a part of the musical performance.
Both the animations and the software framework are written in OpenFrameworks. The
framework follows a Model-View-Controller pattern and consists of a number of
elements. Pressing "r" puts the application into record mode, which lets you
click on the screen to create a new balloon model object and assign the current
mouse location to it. You can also plot the points directly on a video feed via
the control panel (by pressing space), which will automatically map the video
coordinates to screen coordinates.
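A minimal sketch of what the record-mode interaction could look like in an
OpenFrameworks app (the Balloon struct and the member names are assumptions,
not the actual source):

// testApp.h (sketch)
#include "ofMain.h"

// Hypothetical balloon model: a screen position and a scale.
struct Balloon {
    ofPoint pos;
    float   scale;
};

class testApp : public ofBaseApp {
public:
    void setup()             { _recordMode = false; }
    void keyPressed(int key) { if (key == 'r') _recordMode = !_recordMode; }

    void mousePressed(int x, int y, int button){
        if (!_recordMode) return;   // only plot balloons in record mode
        Balloon b;
        b.pos.set(x, y);            // assign the current mouse location
        b.scale = 1.0f;
        _balloons.push_back(b);     // store the new balloon model object
    }

    std::vector<Balloon> _balloons;
    bool                 _recordMode;
};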
Each balloon can be positioned and scaled individually via a custom control panel
or via keyboard shortcuts. You can also offset the position and scale of all the
balloons at once, which is useful when mapping from the video feed to the
projection space. You can save all the Balloon model objects to XML by pressing
Shift-S and load them back into the program by pressing Shift-L. The control
panel will also keep track of its settings from build to build via XML.
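Saving and loading the balloon models could be done with the ofxXmlSettings
addon. A sketch, assuming each balloon stores a position and a scale (the
helper methods, tag names and file name are made up):

#include "ofxXmlSettings.h"

// Shift-S: write every balloon model out to XML.
void testApp::saveBalloons(){
    ofxXmlSettings xml;
    for (int i = 0; i < (int)_balloons.size(); i++){
        int tag = xml.addTag("balloon");
        xml.pushTag("balloon", tag);
        xml.setValue("x",     _balloons[i].pos.x);
        xml.setValue("y",     _balloons[i].pos.y);
        xml.setValue("scale", _balloons[i].scale);
        xml.popTag();
    }
    xml.saveFile("balloons.xml");
}

// Shift-L: rebuild the balloon models from the saved XML.
void testApp::loadBalloons(){
    ofxXmlSettings xml;
    if (!xml.loadFile("balloons.xml")) return;
    _balloons.clear();
    for (int i = 0; i < xml.getNumTags("balloon"); i++){
        xml.pushTag("balloon", i);
        Balloon b;
        b.pos.set(xml.getValue("x", 0.0), xml.getValue("y", 0.0));
        b.scale = xml.getValue("scale", 1.0);
        xml.popTag();
        _balloons.push_back(b);
    }
}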
There are 5 animations in the video: an animation showing coloured circles, an
animation showing a PNG image of a head, an animation showing a sky of stars on
each balloon, an animation showing singing heads and finally an animation
showing letters. These animations can be controlled via MIDI: all of the
animations extend the Animation class, which automatically takes care of the
following process:
The selected animation (the one currently being shown) will get all of the
balloon model objects from the Sensing class and automatically create a
BalloonController object for each model object. These objects are also
automatically updated every frame. The Animation class also handles all MIDI
triggering, assigning a MIDI note to each BalloonController object and calling
the nodeOn() and nodeOff() methods of this class.
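A rough sketch of how the Animation base class could handle this, reusing the
Balloon struct from the sketch above (everything apart from
getNewBalloonController(), nodeOn() and nodeOff() is an assumption about
naming, not the actual source):

class BalloonController {
public:
    virtual ~BalloonController(){}
    virtual void update(){}
    virtual void draw(){}
    virtual void nodeOn(){}      // called when the assigned MIDI note is triggered
    virtual void nodeOff(){}     // called when the assigned MIDI note is released

    Balloon * model;             // the balloon model this controller animates
    int       midiNote;          // the MIDI note assigned to this controller
};

class Animation {
public:
    // Subclasses return an instance of their own controller type here.
    virtual BalloonController * getNewBalloonController() = 0;

    // Create one controller per balloon model received from the Sensing class.
    void setupControllers(const std::vector<Balloon*> & models){
        for (int i = 0; i < (int)models.size(); i++){
            BalloonController * c = getNewBalloonController();
            c->model    = models[i];
            // Use a custom note if one was pushed into _midiNotes, else a default.
            c->midiNote = i < (int)_midiNotes.size() ? _midiNotes[i] : 60 + i;
            _controllers.push_back(c);
        }
    }

    // Called every frame so all controllers stay updated automatically.
    void update(){
        for (int i = 0; i < (int)_controllers.size(); i++) _controllers[i]->update();
    }

    // Route an incoming MIDI note to the matching controller.
    void midiNoteOn(int note){
        for (int i = 0; i < (int)_controllers.size(); i++)
            if (_controllers[i]->midiNote == note) _controllers[i]->nodeOn();
    }

    std::vector<int> _midiNotes;     // optional custom note assignments
protected:
    std::vector<BalloonController*> _controllers;
};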
Creating an animation is a simple, two-step process: first you extend the
Animation class, which will serve as the base of the animation. Next, you
extend the BalloonController class; the Animation class will automatically
create one of these objects for every Balloon model it receives from the
Sensing class. To specify that you want exactly these objects to be created,
you implement the getNewBalloonController() method in your subclass of the
Animation class and return your subclass of BalloonController from this method,
as sketched below.
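For example, the star-sky animation might be wired up like this (the class
names are invented for illustration):

class StarBalloonController : public BalloonController {
public:
    void update(){ /* move this balloon's star particles */ }
    void draw()  { /* draw the stars at model->pos, scaled by model->scale */ }
    void nodeOn(){ /* react to the MIDI trigger, e.g. flash the stars */ }
};

class StarAnimation : public Animation {
public:
    // Step 2: tell the base class which controller type to create
    // for every Balloon model it receives from the Sensing class.
    BalloonController * getNewBalloonController(){
        return new StarBalloonController();
    }
};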
Changing the MIDI note each balloon corresponds to is as simple as pushing
integers into the _midiNotes vector in your subclass of the Animation
class. The balloons will automatically get MIDI notes assigned, but you can
customize these notes via this vector.
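For instance, in the constructor of your Animation subclass (the note numbers
here are just placeholders):

StarAnimation::StarAnimation(){
    // Custom MIDI mapping: the first three balloons respond to notes 36, 38 and 42.
    _midiNotes.push_back(36);
    _midiNotes.push_back(38);
    _midiNotes.push_back(42);
}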
The animation with a sky of stars uses OpenGL Vertex Buffer Objects to draw
the particle images. Each balloon draws around 1500 particles, but I have
found I can go up to around 80,000 total particles while still keeping the
framerate around 60 fps.
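A stripped-down sketch of drawing star particles from a vertex buffer object
with raw OpenGL calls (the buffer layout, names and counts are assumptions):

#include "ofMain.h"

GLuint             starVbo;
int                numStars = 1500;          // roughly the per-balloon count
std::vector<float> starPos(numStars * 3);    // x, y, z per particle

void setupStars(){
    glGenBuffers(1, &starVbo);
    glBindBuffer(GL_ARRAY_BUFFER, starVbo);
    glBufferData(GL_ARRAY_BUFFER, starPos.size() * sizeof(float),
                 &starPos[0], GL_DYNAMIC_DRAW);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

void drawStars(){
    glBindBuffer(GL_ARRAY_BUFFER, starVbo);
    // Upload the latest positions, then draw everything as points in one call.
    glBufferSubData(GL_ARRAY_BUFFER, 0, starPos.size() * sizeof(float), &starPos[0]);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, 0);
    glDrawArrays(GL_POINTS, 0, numStars);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

Batching all the positions into a single buffered draw call like this is what
makes tens of thousands of particles feasible at 60 fps.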
Source Code
on GitHub
23 April, 2010
People Watching Experiments
This is a video I did with Scott Waynes Indiana, Nik Psaroudakis and Nien Lam.
It shows us experimenting with placing objects on the sidewalk to redirect the
flow of pedestrians.
24 March, 2010
itouchyoutouch
Itouchyoutouch is a prototype for an interactive installation I did with
Nikolas Psaroudakis
in OpenFrameworks. We wanted to create a way for people to interact and form
connections across time using spatial media. The installation consists of a
round table with a touch-sensitive surface. When a user touches the screen,
particles will be emitted from the outline of the user's hand. The movement of
the hand will be recorded and stored - along with the outline of the hand - in
an XML file.
This allows other users to touch the surface and interact with recorded touches:
when you touch the surface, the application finds the oldest recorded touch and
plays it back along with the current user's touch.
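A sketch of that playback lookup, assuming each recorded touch keeps a
timestamp and the hand path and outline it was recorded with (the structure and
names are assumptions):

#include "ofMain.h"

struct RecordedTouch {
    float                timestamp;   // when the touch was recorded
    std::vector<ofPoint> path;        // the movement of the hand
    std::vector<ofPoint> outline;     // the outline of the hand
    int                  playbackFrame;
};

// Find the oldest stored touch so it can be replayed alongside the live touch.
RecordedTouch * findOldestTouch(std::vector<RecordedTouch> & touches){
    RecordedTouch * oldest = NULL;
    for (int i = 0; i < (int)touches.size(); i++){
        if (oldest == NULL || touches[i].timestamp < oldest->timestamp){
            oldest = &touches[i];
        }
    }
    return oldest;
}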
When a user moves their hand on top of an old touch, the particles emitted from
the two touches will establish a gravitational pull and start moving towards
each other. The color of the particles will also change, signaling a connection.
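A sketch of how that pull between an old and a new touch could work, assuming
each particle stores a position and a velocity (the names and the strength
constant are made up):

#include "ofMain.h"

struct Particle {
    ofPoint pos;
    ofPoint vel;
};

// Pull the particles of a recorded touch and a live touch towards each other.
void attract(std::vector<Particle> & oldTouch, std::vector<Particle> & newTouch,
             float strength = 0.05f){
    ofPoint oldCenter, newCenter;
    for (int i = 0; i < (int)oldTouch.size(); i++) oldCenter += oldTouch[i].pos;
    for (int i = 0; i < (int)newTouch.size(); i++) newCenter += newTouch[i].pos;
    oldCenter /= MAX(1, (int)oldTouch.size());
    newCenter /= MAX(1, (int)newTouch.size());

    // Each particle accelerates towards the center of the other touch.
    for (int i = 0; i < (int)oldTouch.size(); i++)
        oldTouch[i].vel += (newCenter - oldTouch[i].pos) * strength;
    for (int i = 0; i < (int)newTouch.size(); i++)
        newTouch[i].vel += (oldCenter - newTouch[i].pos) * strength;
}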
The installation is built to allow multi-touch and can be used by many users at
the same time.
17 February, 2010
Generative Logo 1
Lately I've been very interested in generative graphics: creating logos or
designs based on code visualizations. This is one of my attempts to create an
"R" using OpenFrameworks and the Box2D physics library. You can
download
the Source Code here