“The simplicity and elegance of using fans triggered by the subway to activate the fabric made this stand out, especially in a crowded landscape of screen-based window displays.” —juror Keri Elmsly
“Simple and relevant to the physical space.” —juror Harold Jones
Overview: Think of New York City’s clothing store windows, and your mind will inevitably jump to an image of well-dressed mannequins. But for DKNY’s collaboration with the museum-led incubator NEW INC, interactive designer Karolina Ziulkoski decided to bring a different kind of window installation to life. Complete with a bench, curtains and monitors displaying an idle subway platform, the installation sensed whenever someone hurried by on the sidewalk. Suddenly, the monitors would switch to a video of a train approaching the platform. Fans billowed the curtains to reveal mannequins. And inside, entranced passersby could sit on a bench to listen to a patchwork of stories heard in the New York City subway.
• A Kinect sensor tracked the movement of patrons passing by on the street.
• When movement was recognized, a video of a subway train played, and fans billowed the curtains, revealing DKNY mannequins.
• The project took two and a half months to complete.
Comments by Karolina Ziulkoski:
What are the project’s core features? “The piece works because of its simplicity—movement generates wind, wind blows curtains. So the interaction is triggered by something that fits the content. There is no need for instructions because the piece engages people through what they are already doing: walking on the sidewalk. The reveal of the mannequins is like a ‘secret’ that you activate.”
How many videos, images and other media elements does the project have? “There are two subway videos, which alternate between the idle platform—shown when there is no movement—and the train passing. This media establishes the subway context, helping people make the connection with the wind generated by subway cars as they move through the station. The other media component is the audio stories you hear inside the store while sitting on a bench similar to the ones in the New York City subway. The goal is to evoke what happens on the platform as you hear other people’s stories by chance.”
What software, back-end technology and programming languages were used? “I used Arduino and Max/MSP, tools I’ve used before. The movement sensing was done directly in Max/MSP, simply by analyzing the camera’s live stream. The novelty of this project was in the hardware: I used a relay for the first time, since I was controlling 120-volt fans with an Arduino whose outputs supply only five volts. I chose a solid-state relay mainly to avoid the clicking sound a mechanical relay makes when it switches on and off—since the fans would be activated many times throughout the day, the clicking would be quite annoying for store associates.”
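The article does not include Ziulkoski’s actual patch or sketch, but the arrangement she describes follows a common pattern: Max/MSP watches the camera feed for motion and sends a trigger over USB serial, and the Arduino uses that trigger to switch the low-voltage control side of the solid-state relay, which in turn closes the 120-volt fan circuit. The sketch below is a minimal illustration of that pattern only; the pin number, baud rate, ‘1’ trigger character and hold time are assumptions made for the example, not details from the project.

```cpp
// Minimal illustrative Arduino sketch (assumptions: the solid-state relay's
// control input is wired to digital pin 7, Max/MSP sends the byte '1' over
// USB serial at 9600 baud whenever it detects movement, and a hold time keeps
// the fans running briefly after the last trigger). None of these specifics
// come from the article; they only sketch the relay-switching approach it describes.

const int SSR_PIN = 7;                 // low-voltage control side of the solid-state relay
const unsigned long HOLD_MS = 8000;    // keep fans on ~8 s after the last movement trigger

unsigned long lastTriggerMs = 0;
bool fansOn = false;

void setup() {
  pinMode(SSR_PIN, OUTPUT);
  digitalWrite(SSR_PIN, LOW);          // fans off until Max/MSP reports movement
  Serial.begin(9600);
}

void loop() {
  // Max/MSP analyzes the camera's live stream and sends '1' when it sees movement.
  while (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '1') {
      lastTriggerMs = millis();
      if (!fansOn) {
        digitalWrite(SSR_PIN, HIGH);   // relay closes the 120-volt circuit; curtains billow
        fansOn = true;
      }
    }
  }

  // Silently switch the fans off once the sidewalk has been quiet long enough.
  if (fansOn && millis() - lastTriggerMs > HOLD_MS) {
    digitalWrite(SSR_PIN, LOW);
    fansOn = false;
  }
}
```

On the Max/MSP side, something as simple as writing a single byte to the serial port whenever the frame-to-frame difference in the camera stream crosses a threshold would be enough to drive a sketch like this; the hold time keeps the curtains billowing for a moment after someone has already passed the window, and the solid-state relay does its switching without the audible click of a mechanical one.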