Mar 24, 2012

The Conundrum of Preserving Digital Films



Observations On Film Art has done a phenomenal five-part series on the transition from film to digital. Pix and Pixels focuses specifically on preservation (a topic I'm enormously fascinated by), and is my favorite part of the series.
The Science and Technology Council of the Academy recently published its second comprehensive study of “the digital dilemma,” and its authors were surprised that most of the filmmakers they interviewed were unaware of how perishable their work was. Says Milt Shefter, an author of the report:
They were concentrating on the benefits of the digital workflow, but weren’t thinking about what happens to their [digital] masters. They’re structured to make their movie, get it in front of an audience, and then move on to the next one.

I asked Chris Horak of UCLA to imagine a scenario in which a future cache of digital movies has been discovered in an obscure place, permafrost or no permafrost. He answered:
If I found a reel of 35mm film in 500 years and didn’t know what it was, I could probably without too much trouble figure out a way to reverse engineer a projector. In any case, I can always look at the individual frames, even without a projector, and see what is there.
If I find a cache of Blu-rays and DCPs in 500 years, what do I have? Plastic waste. How do you reverse-engineer those media? Impossible. Without an understanding of the software and the hardware, you have zip. No way to look at it, no way to know even if it has any information on it.


It’s hard to get your mind around the scale of the problem. Here is Ken Weissman of the Library of Congress:
Speaking very broadly, with 4K scans of color films you wind up in the neighborhood of 128 MB per frame. . . . Figure that a typical motion picture has about 160,000 frames, and you wind up with around 24 TB per film. And that’s just the raw data. Now you process it to do things like removing dust, tears, and other digital restoration work. Each of those develops additional data streams and data files. We’ve decided, based upon our previous experience, that it is best to save the initial scans as well as the final processed files for the long term. Now we are up to 48 TB per film. In our nitrate collection alone, we have well over 30,000 titles. 48 TB x 30,000 = 1,440,000 TB or 1.44 EB (exabytes) of data.
Weissman adds with a trace of grim humor: “And of course you want to have a backup copy.”
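Weissman's arithmetic is easy to check in a few lines. The figures below all come from his quote; note that 128 MB per frame times 160,000 frames is roughly 20 TB, which he rounds to "around 24 TB," and the headline exabyte figure follows exactly:

```python
# Back-of-the-envelope check of Ken Weissman's storage figures
# (all inputs are from his quote; decimal units, as storage vendors count)
MB, TB, EB = 10**6, 10**12, 10**18

frame_size = 128 * MB            # one 4K color scan frame
frames_per_film = 160_000        # typical feature length
raw_scan = frame_size * frames_per_film
print(raw_scan / TB)             # ~20.5 TB raw; "around 24 TB" in the quote

per_film = 2 * 24 * TB           # keep initial scans AND processed files: 48 TB
titles = 30_000                  # the nitrate collection alone
total = per_film * titles
print(total / EB)                # 1.44 EB, before the backup copy doubles it
```

Even at this rough level of precision, the conclusion holds: a single national nitrate collection, scanned and restored at 4K, lands in exabyte territory before anyone makes a second copy.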