A few images from last week's sketchbook, posted on Tumblr: http://ibmattroberts.tumblr.com/
My work Waves was listed in this article about artworks that utilize real-time data. Make sure you check out the inspiring works listed in the article. I am honored to be among such great works!
I recently attended the FILE festival in São Paulo, Brazil. There I exhibited my work entitled Waves and participated in the FILE symposium. I took a few shots of my work with my iPhone camera, so the quality is not that great, but it will give you a sense of the installation. I am honored to be part of such a great festival and to have exhibited with some fantastic artists.
This artwork responds to the current size and timing of the waves in the ocean closest to its current location. Every half hour the most recent data from the closest ocean buoy station is downloaded. Custom software takes the current wave height and dominant wave period reported by the buoy and transforms that information into a low-frequency sound wave. As the size and timing of the waves in the ocean change, so does the frequency of the sound wave produced by the software. These sound waves shake a bowl of water sitting on top of a speaker. This shaking produces wave patterns in the bowl, which are captured by a video camera, modified by the software, and projected onto a wall. As the waves in the ocean change size and frequency, the waves in the bowl also change. The result is a continuous variation of the shapes and patterns that one sees and hears, reflecting the constantly changing conditions of the ocean.
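To give a rough sense of the kind of data-to-sound mapping described above, here is a minimal Python sketch. The actual piece uses its own custom software, and the scaling constants and function name here are invented for illustration: it simply maps wave height onto a pitch in the low-frequency range and uses the dominant wave period as a pacing interval.

```python
def buoy_to_tone(wave_height_m, dominant_period_s,
                 min_hz=20.0, max_hz=80.0, max_height_m=5.0):
    """Map one buoy reading to a low-frequency tone (illustrative mapping).

    wave_height_m     -- significant wave height in meters
    dominant_period_s -- dominant wave period in seconds
    Returns (frequency in Hz, re-trigger interval in seconds).
    """
    # Normalize wave height to 0..1, clamping anything above max_height_m.
    norm = max(0.0, min(wave_height_m / max_height_m, 1.0))
    # Bigger waves -> higher pitch within the chosen low-frequency band.
    freq_hz = min_hz + norm * (max_hz - min_hz)
    # The dominant period sets how often the tone is re-triggered,
    # so the rhythm of the sound tracks the timing of the ocean waves.
    retrigger_s = dominant_period_s
    return freq_hz, retrigger_s
```

For example, a 2.5 m wave with an 8-second dominant period would produce a 50 Hz tone re-triggered every 8 seconds under this particular mapping.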
Final projects from my class DIGA 231 Interactivity and Art. This is an introductory class where students learn how to program their own interactive software using Max/MSP and how to use Arduinos. Students also learn how to use a variety of switches and sensors, such as distance, light, pressure, knock, temperature, RFID, and heart-rate sensors.
Some test screenshots from a new piece I am working on that uses real-time wind data to generate new images.
In November 2010 I participated in the Digital Arts Festival Taipei 2010. I was invited there by the 404 Festival, and I presented my work Every Step. During the opening weekend I met with participants and outfitted them with cameras to create animations; you can see all the animations created by participants here.
I have a solo show coming up at the Duncan Gallery of Art at Stetson University. The show, entitled Waves Walks, features two bodies of work: a series that uses real-time wave buoy data as a means for generating sounds and images, and a series based on walking. The opening is August 27, 6:00-8:00pm, and the show runs until the 28th of October. Below are some in-progress installation shots.
The two pieces pictured below use real-time wave height and period data from wave buoys off the coast of Florida. Using custom software, the buoy data is translated into low-frequency sound waves. The sound waves shake objects such as bowls of water, which respond by creating abstract patterns.
I created software that abstracts images I took while swimming in the ocean. Each projection has a design element which changes size according to the current size of the waves off the coast of Florida.
Last summer I worked with students Hogan Birney, Sean Kinberger, and David Plakon on creating an interactive live audio/visual performance for Mirror Pal's CD release party. The students asked me to help them develop a multimedia performance for the release party, and I was more than happy to help. We developed a live multiple-camera setup for the band's stage performance, which allowed them to mix live stage shots, prerecorded video clips, and real-time video manipulation. To do this, we modified affordable security cameras so they could be easily placed on stage and created a mixing station to switch between the cameras. We also created our own software to mix the live footage with prerecorded clips and add effects in real time. Audience members could also submit text messages, which were mixed with the live images and projected during the performance. We also created an interactive photo booth that audience members could sit inside to create short animations that were used during the band's performance. The project was very ambitious for three students, but they did an outstanding job. Here is some video they created to document the event.
MPG: Mobile Performance Group was invited to perform at the Intermedia Festival hosted by Indiana University-Purdue University Indianapolis. For the performance I developed an interface for the iPhone/iPod touch using the OSC-based mrmr app. The interface allows the public to control the manipulation of live video and send text messages that become part of the live video projection. Users are able to do things such as mix video, choose video clips, apply effects, and use the iPhone's accelerometer to rotate and position the text and image. The festival was a blast, and I will post some documentation soon.
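mrmr sends its control data as OSC messages over UDP. As a rough illustration of what actually travels over the wire (the address /1/fader1 is hypothetical, and the project itself used mrmr and existing OSC tools rather than hand-rolled code), here is a minimal sketch of how a standard OSC 1.0 message carrying float arguments is packed:

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Pack an OSC 1.0 message with float arguments (illustrative sketch)."""
    # Type tag string: a comma followed by one 'f' per float argument.
    tags = "," + "f" * len(args)
    msg = _pad(address.encode()) + _pad(tags.encode())
    for a in args:
        # Floats are encoded as 32-bit big-endian IEEE 754 values.
        msg += struct.pack(">f", a)
    return msg
```

A datagram like osc_message("/1/fader1", 0.5) could then be sent to the video software's listening port with a plain UDP socket; the receiving patch dispatches on the address pattern to decide which parameter the value controls.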