Concept
The UCSF Weill Institute for Neurosciences is a new research and treatment facility at the University of California, San Francisco's Mission Bay campus. The Institute seeks to bring together world-class researchers and physicians to solve fundamental problems in neurology, psychiatry, and neurological surgery.
UCSF enlisted Obscura to create a unique display system that conveys the scientific and therapeutic breakthroughs happening at the Weill Institute. We teamed up with doctors and researchers to reimagine the way a clinical research facility communicates with the public. The team at Obscura developed a state-of-the-art volumetric installation that boldly renders the Institute's achievements in a mesmerizing, larger-than-life holographic space. The Weill Institute for Neurosciences is set to open in 2020.
Development
This project brought together a cross-disciplinary team of designers and engineers to create a large-scale, truly volumetric display. The display occupies an L-shaped section behind the curtain wall of the new building, so our system had to accommodate an unusual, non-rectangular footprint and non-uniform pixel density around the curved section. We also wanted to leverage Obscura's artistic resources and support a traditional animator's workflow of precisely animating rigged 3D geometry, not just content created by a programmer.
I designed the volumetric pipeline, real-time previewing, and playback systems using Houdini and TouchDesigner. Our system relies on a 3D point cloud of the LED display, which is used throughout the pipeline. We used Houdini to build an artist-friendly pipeline that converts animations to VDB volumes, colorizes them, and samples them with a custom slicing tool I created. The slicing tool progressively samples the VDB into a texture atlas at regular depth increments.
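The slicing step can be sketched in a few lines of NumPy. This is a simplified illustration, not the production Houdini tool: it assumes the VDB has already been rasterized and colorized into a regular voxel grid, and it simply tiles the depth slices into one 2D atlas image.

```python
import numpy as np

def slice_to_atlas(volume, tiles_x):
    """Tile the depth slices of an RGB voxel grid into a 2D texture atlas.

    volume:  (depth, height, width, 3) array of colorized voxel samples,
             i.e. the volume sampled at regular depth increments.
    tiles_x: number of slices packed per atlas row.
    """
    depth, h, w, c = volume.shape
    tiles_y = int(np.ceil(depth / tiles_x))
    atlas = np.zeros((tiles_y * h, tiles_x * w, c), dtype=volume.dtype)
    for z in range(depth):
        # Slice z lands at tile (row, col) in the atlas grid.
        row, col = divmod(z, tiles_x)
        atlas[row * h:(row + 1) * h, col * w:(col + 1) * w] = volume[z]
    return atlas

# Example: a 16-slice, 64x64 volume packed into a 4x4 grid of tiles,
# yielding a single 256x256 atlas texture.
vol = np.random.rand(16, 64, 64, 3).astype(np.float32)
atlas = slice_to_atlas(vol, tiles_x=4)
print(atlas.shape)  # (256, 256, 3)
```

Packing slices into an atlas keeps the whole volume in a single 2D texture, a format that downstream real-time tools ingest easily.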
The texture atlas is used for both previewing and outputting to the LEDs in TouchDesigner. In Touch, the texture atlas is converted to a 3D texture, which provides trilinear interpolation along all three axes. This allowed us to arrange LEDs in any shape or configuration, freeing us from rectangular formats. Our previewing tool instanced accurate models of the different LED products we were evaluating in real time, which helped inform the display's design and optimize for readability, viewing angle, and occlusion. Working closely with our hardware team, I created a pixel map to convert the 3D texture into the 2D signal the LED controllers expected, eventually resulting in a perfect representation of our content in an LED volume.
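The two ideas above, sampling the volume at each LED's 3D position and reordering the results for the controllers, can be sketched as follows. This is a hedged illustration in NumPy rather than the actual TouchDesigner network: `trilinear_sample` mimics in software the trilinear filtering a GPU applies to a 3D texture, and `pixel_map` here is a hypothetical permutation standing in for the real controller channel mapping.

```python
import numpy as np

def trilinear_sample(volume, points):
    """Sample an RGB voxel grid at normalized (x, y, z) points in [0, 1]^3,
    blending the 8 surrounding voxels like GPU trilinear filtering."""
    d, h, w, _ = volume.shape
    # Map normalized coords to continuous voxel coords; clamp just below the
    # top index so the +1 neighbor lookup stays in range.
    xs = np.clip(points[:, 0] * (w - 1), 0, w - 1 - 1e-6)
    ys = np.clip(points[:, 1] * (h - 1), 0, h - 1 - 1e-6)
    zs = np.clip(points[:, 2] * (d - 1), 0, d - 1 - 1e-6)
    x0, y0, z0 = xs.astype(int), ys.astype(int), zs.astype(int)
    fx, fy, fz = xs - x0, ys - y0, zs - z0
    out = np.zeros((len(points), volume.shape[3]))
    # Accumulate the 8 corner voxels, each weighted by its fractional overlap.
    for dz in (0, 1):
        for dy in (0, 1):
            for dx in (0, 1):
                wgt = (np.where(dx, fx, 1 - fx)
                       * np.where(dy, fy, 1 - fy)
                       * np.where(dz, fz, 1 - fz))
                out += wgt[:, None] * volume[z0 + dz, y0 + dy, x0 + dx]
    return out

# The LED point cloud drives the sampling: one normalized position per LED.
led_points = np.random.rand(100, 3)
vol = np.ones((8, 8, 8, 3), dtype=np.float32)   # a constant white test volume
colors = trilinear_sample(vol, led_points)       # one RGB value per LED

# Hypothetical pixel map: controller channel i displays LED pixel_map[i],
# flattening the 3D arrangement into the 2D stream the hardware expects.
pixel_map = np.random.permutation(len(led_points))
signal = colors[pixel_map]
```

Because the LEDs are addressed purely by their measured 3D positions, the same content plays back correctly on any physical arrangement, rectangular or not.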
My work on this project led to a patent:
Position-Based Media Pipeline for Volumetric Displays
Team
Software Engineering Lead: Kurt Kaminski
Executive Producer: Sherri Nevins
Creative Director: Kaya Ono
Technical Producer: Jay Ho
Technical Director: Joe Martin
Hardware Engineer: Desmond Shea
Industrial Design: Hoss Ward, Patrick Vigorito