UW Husky Dawg Daze 2017 Interactive Photo Booth
Husky Dawg Daze is a series of events for incoming freshmen at the University of Washington. My collaborator Julie Cruse and I created this photo booth to give students an interactive way to commemorate this exciting time in their lives.
It’s the first week of campus orientation for incoming freshmen. The UW campus is a promising new adventure, rich with opportunity for reinvention as an adult, an academic, and a community member. Born in 1998 or 1999, this class has always known a connected life of mobile devices, social media, apps, selfies, Wi-Fi, and streaming video. The energy of this pivotal moment is abundant, and they are sharing the experience with new friends, forming friendships that could last a lifetime.
On this fine evening, they will be attending a dance party: a celebration of this new and exciting moment in their lives.
Instead of just a selfie, let them take a “groupie” photo as a memento for the occasion. They can track it down on social media, share it, and commemorate the experience. With video effects, they can remix the photo and create something unique that also reacts to the environment (think Snapchat filters, but more esoteric). This interaction should raise questions about the creative process.
Minimum Viable Product Specification
- Photo booth
- Pushes photos to Social Media
- Video effects that are interactive for subjects/users
- Does not use touch screen
- UW Branded
- Visually interesting as a live feed broadcast/video projection
New students took nearly 1,400 photos over the course of the four-hour event! A live feed of the photo booth was broadcast via UDP to projectors inside and outside the venue. The installation was an attraction, and many students enjoyed it. Mission accomplished.
We tracked our progress through our very short development cycle in spreadsheets. This is what our final responsibilities looked like:
I was involved in almost every part of the process for this project. Designing and building it was ambitious given the time frame, but I was able to pull everything together after some proofs of concept and a few failures. Only a few of our original features did not make it in. The concept was formed over about five one-hour meetings and programmed in roughly 30 hours. We held two evenings of tech rehearsals prior to the event, which proved critical to the project's success.
Created with Max/MSP and Jitter, the patch brings together motion tracking from the dp.kinect external and an excellent tutorial from the Cycling '74 website on the jit.gl.mesh object. Hand-position data lets users displace the mesh's polygons like a broken mirror. When every hand detected in the scene rises above a certain threshold on the Y-axis, a countdown timer begins and a still frame is captured. jit.gl.node was used to composite all of the OpenGL and 2D video layers, including the branding. The microphone input was gated by a loudness threshold that detected the beat of the music; each time the sound peaked above that threshold, the polygons changed shape at random.
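The trigger logic described above lived in the Max patch itself, but it can be sketched in pseudocode form. The following Python toy model (all thresholds and names are illustrative, not taken from the actual patch) shows the per-frame decision: an audio peak reshuffles the polygon shapes, and a frame is captured only after every tracked hand stays raised through a full countdown.

```python
# Illustrative re-creation of the booth's trigger logic.
# Thresholds below are assumptions, not values from the real patch.
HAND_RAISE_Y = 0.6      # normalized Y above which a hand counts as "raised"
LOUDNESS_PEAK = 0.8     # loudness above which we treat the audio as a beat
COUNTDOWN_FRAMES = 90   # ~3 seconds at 30 fps

class BoothLogic:
    """Toy per-frame state machine for the photo-booth triggers."""

    def __init__(self):
        self.countdown = None  # frames remaining, or None if not counting

    def update(self, hand_ys, loudness):
        """Advance one frame.

        hand_ys:  list of tracked hand Y positions (normalized 0..1)
        loudness: current microphone loudness (normalized 0..1)
        Returns "reshuffle" on an audio peak, "capture" when the
        countdown completes, otherwise None.
        """
        if loudness > LOUDNESS_PEAK:
            return "reshuffle"  # polygons pick new random shapes

        if hand_ys and all(y > HAND_RAISE_Y for y in hand_ys):
            if self.countdown is None:
                self.countdown = COUNTDOWN_FRAMES  # all hands up: start
        else:
            self.countdown = None  # a hand dropped: cancel the countdown

        if self.countdown is not None:
            self.countdown -= 1
            if self.countdown <= 0:
                self.countdown = None
                return "capture"  # take the still photo
        return None
```

In the patch this was wired with Max objects rather than a class, but the flow is the same: the audio gate short-circuits everything, and the capture only fires if the hands-up condition holds for the whole countdown.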
Video was streamed via UDP over Wi-Fi to projectors inside and outside the venue, luring users to the booth and creating an interesting spectacle.
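The actual streaming was handled by Jitter's networking objects, but the general idea of pushing encoded frames over UDP can be sketched in a few lines. This Python example (addresses and port are placeholders) only shows the chunking needed to stay under the UDP datagram size limit; a production pipeline would also add sequence numbers and frame boundaries so the receiver can reassemble frames.

```python
import socket

MAX_DGRAM = 60000  # stay safely under the ~65507-byte UDP payload limit

def send_frame(sock, frame_bytes, addr):
    """Split one encoded video frame into UDP-sized chunks and send them.

    frame_bytes: the compressed frame (e.g. a JPEG) as bytes
    addr:        (host, port) tuple of the receiving projector machine
    """
    for offset in range(0, len(frame_bytes), MAX_DGRAM):
        sock.sendto(frame_bytes[offset:offset + MAX_DGRAM], addr)

# Usage sketch (host and port are hypothetical):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_frame(sock, jpeg_bytes, ("192.168.0.42", 9000))
```

UDP suits a live feed like this because a dropped chunk just means a briefly glitched frame rather than a stalled stream, which is an acceptable trade-off for a party projection.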