Last summer I worked with students Hogan Birney, Sean Kinberger, and David Plakon to create an interactive live audio/visual performance for Mirror Pal’s CD release party. The students asked me to help them develop a multimedia performance for the event, and I was more than happy to help. We built a live multi-camera setup for the band’s stage performance that allowed them to mix live stage shots, prerecorded video clips, and real-time video manipulation. To do this we modified affordable security cameras so they could be easily placed on stage and built a mixing station for switching between them. We also wrote our own software to mix the live footage with prerecorded clips and add effects in real time. Audience members could submit text messages, which were mixed with the live images and projected during the performance. We also created an interactive photo booth that audience members could sit inside to create short animations used during the band’s set. The project was very ambitious for three students, but they did an outstanding job. Here is some video they created to document the event.
MPG: Mobile Performance Group was invited to perform at the Intermedia Festival hosted by Indiana University–Purdue University Indianapolis. I am working on an interface for the iPhone/iPod touch using the OSC-based mrmr app. The interface allows the public to control the manipulation of live video and to send text messages that become part of the live video projection. Users are able to mix video, choose video clips, apply effects, and use the iPhone’s accelerometer to rotate and position text and images. The festival was a blast, and I will post some documentation soon.
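The control flow behind an interface like this can be sketched in a few lines. This is a hypothetical illustration only: the OSC address names (`/video/mix`, `/video/clip`, `/accel`) and parameter ranges are assumptions, not the actual mrmr layout used in the performance, and real use would receive the messages over the network rather than call the handler directly.

```python
import math

def accel_to_rotation(x, y):
    """Map accelerometer x/y components (-1.0..1.0) to a rotation angle in degrees."""
    return math.degrees(math.atan2(y, x))

class VideoControl:
    """Holds the live-video parameters driven by incoming phone messages."""

    def __init__(self):
        self.mix = 0.5        # crossfade between two video sources (0..1)
        self.clip = 0         # index of the selected prerecorded clip
        self.rotation = 0.0   # rotation of projected text/image, in degrees

    def handle(self, address, *args):
        # Dispatch an OSC-style message (address + arguments) to a parameter update.
        if address == "/video/mix":
            self.mix = max(0.0, min(1.0, args[0]))
        elif address == "/video/clip":
            self.clip = int(args[0])
        elif address == "/accel":
            self.rotation = accel_to_rotation(args[0], args[1])

vc = VideoControl()
vc.handle("/video/mix", 0.8)
vc.handle("/accel", 0.0, 1.0)  # phone tilted: rotation becomes 90 degrees
```

The dispatch-on-address pattern mirrors how OSC messages are routed in environments like Max/MSP, where each address is matched to a parameter it controls.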
Last month I presented my project Every Step at City Centered – A Festival of Locative Media and Urban Community. The festival was located in the Tenderloin District of San Francisco and featured some great artists. A great event in an incredible city; I miss it already!
Below are a few videos from the final performance of a class I taught called Collaborative Multimedia Performance. In this class, students learned how to collaborate with peers from different majors: Music, Art, and Computer Science. They were taught a variety of techniques for live performance using electronics and software.
For this assignment, students used contact mics to turn an object into an instrument. They built their own contact mics and attached them to a table and bottles to create a percussive instrument. The microphones were run through guitar effects pedals and amplified. Students: David Plakon, Sean Kinberger, and Zeb Long.
Student Ian Guthrie performs under the name Benny Loco and Uncle Abuelito. For this performance, Ian teamed up with student Jana Fisher to create a visual accompaniment for his music. Jana built her own VJ software in the Max/Jitter programming environment, which allowed her to manipulate clips from the Twilight Zone in time with the music.
Final projects from my DIGA 231 Interactivity and Art class. Students in this class learn how to program their own interactive software in Max/MSP and how to use Arduino boards to create a link between the physical and digital worlds. They also learn to work with a variety of switches and sensors, including distance, light, pressure, knock, temperature, RFID, and heart-rate sensors.
This artwork responds to networked devices in its vicinity, such as cell phones and laptops, with sounds and imagery of the character Lemmy Caution from Jean-Luc Godard’s film Alphaville. A computer running network sniffing software monitors the area for wireless signals such as Bluetooth and Wi-Fi. When a new wireless device enters the vicinity, the piece responds by announcing the device’s name and whether it is considered a threat. The piece requires one Macintosh computer with Wi-Fi and Bluetooth, custom software, a video projector, and speakers.
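The announcement logic described above can be sketched as follows. This is a minimal illustration of the detect-once-and-announce behavior, not the actual sniffing code: real capture would require a packet-capture library and radio hardware, and the device records and watchlist name here are stand-ins for sniffed Bluetooth/Wi-Fi beacons.

```python
seen = set()                     # MAC addresses already announced
WATCHLIST = {"unknown-probe"}    # hypothetical device names flagged as threats

def on_device(mac, name):
    """Announce a device the first time its MAC address appears; ignore repeats."""
    if mac in seen:
        return None
    seen.add(mac)
    threat = name in WATCHLIST
    message = f"{name} has entered the zone."
    if threat:
        message += " Threat detected."
    return message

print(on_device("aa:bb:cc:dd:ee:ff", "lemmy-laptop"))
print(on_device("aa:bb:cc:dd:ee:ff", "lemmy-laptop"))  # already seen: None
```

Keying on the MAC address rather than the name ensures a device is announced only once per visit, even if it broadcasts repeatedly.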
A metal shopping cart converted into a mobile interactive audio/visual instrument. Touch and pressure control the live manipulation of sound and image. The cart is equipped with a video projector, computer, and battery, making it portable and self-contained. A microcontroller (Arduino) and custom software (Max/MSP/Jitter) sense the user’s touch and translate its pressure into a real-time visual and sonic response. The cart is played by MPG performers, and the audience is encouraged to play it as well. This project was created by students Hogan Birney, Sean Kinberger, and David Plakon under the direction of Matt Roberts at Stetson University’s Digital Art Program.
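The translation step from sensor pressure to sound and image parameters amounts to rescaling a raw reading into useful ranges. The sketch below assumes a 10-bit Arduino analog read (0–1023); the specific output parameters and their ranges are illustrative, not the cart’s actual mapping.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Clamp a sensor value to its input range, then rescale it linearly."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def touch_to_params(raw):
    """Map a raw 10-bit pressure reading to hypothetical audio parameters."""
    return {
        "volume": scale(raw, 0, 1023, 0.0, 1.0),      # harder press = louder
        "pitch_hz": scale(raw, 0, 1023, 110.0, 880.0), # harder press = higher pitch
    }
```

In the actual piece this mapping would live in Max/MSP, with the Arduino simply streaming sensor values over serial; the scaling idea is the same either way.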
A small cardboard box converted into an interactive instrument. Inside the box is a wireless device (a Wii Remote) sending motion data. Users turn the box to select between images and text; rotating the box also turns and scrolls the image in real time. This project was created by Mobile Performance Group under the direction of Matt Roberts at Stetson University’s Digital Art Program.
Audience members at an MPG: Mobile Performance Group performance were asked to contribute by sending text messages. Participants’ messages were used as performance material and projected onto public spaces. This project was created by student Derick Ostrenko under the direction of Matt Roberts at Stetson University’s Digital Art Program.
Every Step allows a participant to create a short experimental animation while they walk. Each participant is given an armband with a mounted camera and a pedometer. The pedometer sits inside the armband and is wired to the camera, which points toward the sky. The pedometer acts as a trigger: every time the participant takes a step, the camera captures an image of whatever is above them.
To create an animation, the participant simply puts on the armband and takes a walk wherever they like. When they return, the images are transferred from the camera’s memory and loaded into a custom software program, which assembles them into a frame-by-frame animation and generates a soundtrack. When the animation is complete, a DVD is made and given to the participant.
Every Step was reviewed by We Make Money Not Art: “The techy work that really charmed me by its simplicity, poetry and melodies was Every Step.”