“Such a poetic way to communicate some very technical messages. Interacting with fireflies is a perfect metaphor as well.” —juror Gabe Kean
“Stunningly beautiful and well executed. I especially loved the approach to showing off pixel pitch.” —juror Drew Ungvarsky
Overview: Technology and nature can be unexpectedly connected—take electroluminescence and bioluminescence, the closely related mechanisms by which LEDs and fireflies emit light. Captivated by this connection, interactive studio Second Story, part of SapientNitro, worked with audio-visual company Electrosonic to create a responsive environment choreographed across multiple LED surfaces. During an event held at the LED Lab in New York City, more than 200 guests “caught” fireflies on an interactive touch table and “released” them to the dark LED surfaces peppered throughout the room. Releasing the fireflies brought the canvases to light—the more fireflies guests released, the brighter the space became.
• The project celebrated audio-visual company Electrosonic’s 50th anniversary.
• Second Story had to use the variety of uniquely shaped screens already present at the LED Lab.
• The project took six weeks to produce.
Comments by Chris Carlson, interactive developer, and Adi Marom, associate creative director, experience design, both of Second Story, part of SapientNitro:
How many videos, images and other media elements does it have? “For each of the three firefly ‘scenes,’ we created a photo-realistic animated landscape that enveloped the room across all of the LED surfaces. Each animation consisted of multiple layers stitched together into a parallax loop that orbited the space, resulting in a continuous digital environment. For the touch table, we produced an animated flock of fireflies consisting of ten varieties for guests to choose from; they differed in color, form and motion. The interface simulated the pixel pitch of each LED canvas as a mask above the fireflies, enabling users to experience what a firefly would look like on the different screens. When a guest released a firefly to one of the displays, the firefly was animated into the selected surface, overlaying the environment’s animation.”
What software, back-end technology and programming languages were used? “The software consisted of two apps written in openFrameworks/C++. One app ran on a large multitouch table and enabled visitors to ‘catch’ fireflies, learn more about different LED pixel pitches and send their firefly to a specific LED surface within the space. The second app managed the firefly environments, assigning incoming fireflies to areas of the screen corresponding to the different LED surfaces around the space and scaling the animations for the appropriate resolution. A D3 engine was used to transform the output of the second app to the required signals for each physical canvas.”
Are there any other technical features you’d like to call attention to? “We added an iPad application and distributed a few iPads in the space during the event. They featured additional information about the different LED surfaces, specifically, pixel pitches and ideal viewing distances. The purpose was to put another educational tool and contextual layer in the hands of the guests in the space.”