“I was very impressed with the level of execution on this project. The content pairs very nicely with the 3-D video interaction—the user feels compelled to be looking constantly in all directions, anxious and excited to see what happens next.” —juror Gabe Kean
“The execution was incredible. I watched it twice and still have a lot of questions about how it was done. Virtual reality is still an emerging channel, and this project pushes the limits of what’s been done with the medium.” —juror Drew Ungvarsky
Overview: It’s a classic tale of heroines, heroes and aliens, told in a not-so-classic 360-degree environment. “HELP” is the fourth story in Google Spotlight Stories and the series’s first immersive, live-action film made uniquely for mobile. Viewers can use their smartphone as a window to control their immersion into the “HELP” world—simply by turning their phone in a new direction, they can set a new pace, focus on any part of the scene or frame a different shot.
• More than 80 people from the Mill collaborated with director Justin Lin, Lin’s production company Bullitt, and Google’s Advanced Technology and Projects (ATAP) team, which developed the Spotlight Stories app, to produce this project in thirteen months.
• The final piece ran nearly 10,000 frames as a single image sequence, and the final delivery was in cubic-map format; with six faces per frame, that totals nearly 60,000 frames (see the sketch after this list).
• It is available for free with the Google Spotlight Stories app, on Android via Google Play and on iOS via the iTunes App Store.
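For context on the frame math above: a cubic-map delivery stores each frame as six face images, so the totals multiply directly. A back-of-the-envelope sketch (the face names follow the common cube-map convention; the naming scheme actually used on “HELP” is an assumption here):

```python
# Back-of-the-envelope math for a cubic-map delivery.
# Face names are the conventional cube-map axes, not The Mill's actual naming.
FACES = ["pos_x", "neg_x", "pos_y", "neg_y", "pos_z", "neg_z"]

sequence_frames = 10_000                       # the ~5-minute continuous shot
total_images = sequence_frames * len(FACES)    # one image per face per frame

print(f"{sequence_frames:,} frames x {len(FACES)} faces = {total_images:,} images")
# -> 10,000 frames x 6 faces = 60,000 images
```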
Comments by Gawain Liddiard and Daniel Thuresson:
What do you think are the project’s core features? “All of the software in our industry is geared toward viewing the world from a rectilinear angle, so an immersive project like ‘HELP’ breaks almost every part of a traditional production pipeline. A camera rig was developed that combined four Red cameras, capturing four viewpoints that would be stitched together to create a 360-degree environment. We also created Mill Stitch, an on-set virtual reality toolset tailored for high-end cinematic production. Its features—including live video ingestion, lens dewarping and stitching across multiple cameras—enabled Lin to monitor the 360-degree camera arrays in real time using a joystick, a virtual camera and a touchscreen. Finally, the ingestion of 6K image sequences from each of the four cameras, along with a huge amount of visual effects, enabled us to create this continuous five-minute shot at cinematic quality.”
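The stitching step Liddiard and Thuresson describe comes down to a mapping: every pixel of a 360-degree panorama corresponds to a viewing direction, and each direction must be assigned to (and blended between) the cameras that see it. Mill Stitch itself is proprietary; the sketch below assumes a simple four-camera horizontal rig and an equirectangular output, and omits the lens dewarping and seam blending a production tool performs:

```python
import numpy as np

def equirect_directions(width, height):
    """Unit viewing direction for each pixel of an equirectangular panorama."""
    lon = (np.arange(width) + 0.5) / width * 2 * np.pi - np.pi    # -pi..pi, left to right
    lat = np.pi / 2 - (np.arange(height) + 0.5) / height * np.pi  # +pi/2..-pi/2, top to bottom
    lon, lat = np.meshgrid(lon, lat)
    x = np.cos(lat) * np.sin(lon)   # right
    y = np.sin(lat)                 # up
    z = np.cos(lat) * np.cos(lon)   # forward
    return np.stack([x, y, z], axis=-1)

def camera_index(directions, num_cameras=4):
    """Assign each direction to the camera whose horizontal (yaw) sector contains it."""
    yaw = np.arctan2(directions[..., 0], directions[..., 2])      # -pi..pi around the rig
    sector = (yaw + np.pi) / (2 * np.pi) * num_cameras
    return sector.astype(int) % num_cameras

dirs = equirect_directions(4096, 2048)
owner = camera_index(dirs)   # which of the four cameras "owns" each output pixel
# A real stitcher would also dewarp each camera's lens, resample its pixels
# along these directions and feather-blend across the sector seams.
```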
Describe any special interactive features. “The video is designed for viewers to find their own perspective within the media. There’s no way of controlling where the viewer is looking—no way of hiding or getting around issues. For example, if the viewer chooses to investigate the background scenery, that has to hold up as much as the monster in the foreground. This robbed us of a lot of traditional cinematic techniques for guiding the viewer. So for this project, Google’s ATAP team built in very delicate aids that guide the viewer toward points of interest and lock onto key characters and objects. There is even an ambisonic soundtrack that is decoded into a binaural soundtrack based on the viewer’s point of view, immersing viewers in the sounds happening all around them.”
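The head-tracked audio they mention has a well-known core: a first-order ambisonic (B-format) signal can be rotated to follow the viewer’s orientation before being rendered to two ears. A true binaural decode convolves with HRTFs; the sketch below substitutes a simplified virtual-cardioid stereo decode, and the rotation convention is an assumption:

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, yaw):
    """Counter-rotate the sound field when the viewer turns left by `yaw` radians
    (B-format convention assumed: X forward, Y left, Z up, W omnidirectional)."""
    c, s = np.cos(yaw), np.sin(yaw)
    x_r = c * x + s * y
    y_r = -s * x + c * y
    return w, x_r, y_r, z            # W and Z are unaffected by yaw

def decode_stereo(w, y):
    """Two virtual cardioid microphones aimed 90 degrees left and right."""
    left = 0.5 * (np.sqrt(2) * w + y)    # cardioid pointing +Y (listener's left)
    right = 0.5 * (np.sqrt(2) * w - y)   # cardioid pointing -Y (listener's right)
    return left, right

# e.g. the viewer has turned 45 degrees to the right:
# w, x, y, z = rotate_bformat_yaw(w, x, y, z, np.radians(-45))
# left, right = decode_stereo(w, y)
```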
What software, back-end technology and programming languages were used? “A lot of proprietary software was written to work in conjunction with our core off-the-shelf packages: 3-D computer graphics software Maya, 3-D animation software Houdini (rendered through Mantra), 3-D rendering application Arnold, digital compositing application NUKE, 3-D texture painting application Mari and commercial rendering plug-in V-Ray. All of this software was supported by the Mill’s internal pipeline, and we upgraded our hardware to handle the volume of data and the weight of the rendering. In addition, we used stand-alone tools like Mill Stitch and delensing software. We also created our own review tools to run on tablets and smartphones, along with plug-ins for RV, our main review tool, which enabled us to view work in 360 degrees and distribute it to clients and artists in a way that showed them the media as it would be seen on the final device.”
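A 360-degree review tool like the ones described has to answer one question per screen pixel: given the direction the reviewer is looking, which cube-map face, and which point within it, should be shown? A minimal sketch of that lookup, assuming the common cube-map face convention (the Mill’s internal tools are not documented here):

```python
FACES = ["pos_x", "neg_x", "pos_y", "neg_y", "pos_z", "neg_z"]

def cubemap_lookup(d):
    """Return (face_name, u, v) for a unit direction d = (x, y, z), with u, v in [0, 1]."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                 # an X face dominates
        face, sc, tc, ma = (0 if x > 0 else 1), (-z if x > 0 else z), -y, ax
    elif ay >= az:                            # a Y face dominates
        face, sc, tc, ma = (2 if y > 0 else 3), x, (z if y > 0 else -z), ay
    else:                                     # a Z face dominates
        face, sc, tc, ma = (4 if z > 0 else 5), (x if z > 0 else -x), -y, az
    u = 0.5 * (sc / ma + 1.0)
    v = 0.5 * (tc / ma + 1.0)
    return FACES[face], u, v

print(cubemap_lookup((0.0, 0.0, 1.0)))   # looking straight ahead
# -> ('pos_z', 0.5, 0.5): the center of the forward face
```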