“Evolution is exactly the right word. It’s been a transition from film to electronic imaging for 40 years,” says Stephen Lighthill, Senior Filmmaker-in-Residence at the American Film Institute, reminiscing about the emergence of digital and the decline of analog film, which began in the late ’70s. Back then, Lighthill was freelancing for 60 Minutes and had just made the switch from 16mm film to analog video. Around that same time, Steven Sasson of Kodak presented the first digital camera to the company.
Sasson’s camera pitch was shelved, and years passed before the first digital cameras emerged. After many refinements to the technology, Sony released the first digital video camera in 1995.
By the late ’90s, digital cameras were still far inferior to Hollywood’s standards. So, when George Lucas decided to shoot the fourth installment of his Star Wars franchise, Episode I: The Phantom Menace, on digital, he had to push Sony and other companies to develop equipment that met his needs. At the time, Sony’s cameras did not shoot at 24 frames per second, Hollywood’s standard rate, and he needed the format to be compatible with existing editing systems.
While Lucas wasn’t able to get ahold of the camera, along with accompanying lenses from Panavision, in time to shoot Episode I, the director would put them to use for Episode II: Attack of the Clones.
In 2002, Lucas shot Episode II entirely on digital cameras. When the movie reached theaters, however, it was mostly projected on 35mm film: only 20 screens in America had digital projectors capable of showing Episode II, compared to 3,000 screens that showed the movie on film.
Roger Ebert was not impressed by the film’s image quality. In his first review, based on a 35mm screening, he wrote that the images exhibited “a certain fuzziness, an indistinctness that seemed to undermine their potential power.”
However, when he saw the movie projected digitally, he revised his opinion, saying it was sharper and punchier than on film. The critic foresaw the potential of digital filmmaking.
Lighthill explains that digital cameras had many issues in the early 2000s, although image quality was not the primary problem. They were clumsy, offered few lens options, and made editing difficult. The biggest hurdle for digital, however, was projection.
At the time, most theaters were still projecting on film. The major LA studios formed Digital Cinema Initiatives (DCI) to pave the way for the movie industry to convert to the format. DCI tested digital projectors in 2002, which, as Lighthill explains, was the moment when people in cinematography “realized that everything was going to be different as soon as digital projectors” entered the marketplace.
The speed of this transition is still widely considered impressive, and some consider the effort of keeping up with these changes futile. Lighthill explains that before digital, a camera would remain relevant for many years. The body might become a little lighter, or it might function slightly more efficiently, but overall, the changes were minor. The current pace of evolution is breakneck.
Austin Reza, the Creative Director of the LA-based advertising company Reza & Co, started shooting on Digital8 tape in the late ’90s. He tells BTR a little about how he tries (not) to keep up.
“I’m not a person who stays up on the bleeding edge of technology,” he says. “To me that’s like a writer or illustrator talking about the latest pen technology.”
Instead, Reza found a camera he loves and carries it with him everywhere: his Panasonic GH4, which shoots 4K cinema video (4096 x 2160 pixels). After learning to shoot on tape, he finds the resolution of modern cameras staggering.
The speed of such technological evolution also transforms the fundamental “rules” used to judge the quality of a camera. The common assumption that a larger sensor means better image quality is becoming outdated; today, small cameras with small sensors take wonderful pictures.
Likewise, the idea that bigger cameras take better pictures no longer holds. Digital SLR cameras are being replaced by mirrorless ones, which take pictures largely on par with DSLRs but are smaller, lighter, easier to maintain, and shoot at higher frame rates. Better digital cameras don’t just mean better quality; they offer techniques film never could.
Digital also allows filmmakers to accomplish ambitious goals that Alfred Hitchcock set 60 years ago. Had the director possessed a digital camera back in 1948, he would have been able to shoot Rope as one long, continuous scene. It was an impressive dream, but in reality, Hitchcock was limited to shooting only 10 minutes at a time, the length of a roll of film. Rope ended up comprising 10 takes, and to make the movie look like one continuous shot, he hid the cuts, subtly disguising the transitions by zooming in on an actor’s back.
It wasn’t until digital that Hitchcock’s cinematic dream could be realized. The 2002 film Russian Ark, 96 minutes long with a cast of 2,000, was shot in one take with zero cuts. It stands as the longest single shot in film history, and it was the first feature film created in a single take.
Just as supporters of film honor its unique look, others enjoy the accessibility that digital offers.
“Our latest Bentley commercial was a great example of choosing a format based on creative needs,” explains Reza. “We chose [to film with] the iPhone because it was as much a character in the story as the Bentley Mulsanne we were showcasing.”
The concept, Reza explains, was to show the integration of design and technology.
Despite digital’s incredible yearly advancements, Lighthill still sees the format as a “compromise” compared to film.
“There is a gentler quality with skin tones that you get with film that you don’t get with digital imaging,” he insists. He also points out a pitfall of digital known as fixed-pattern noise, a grainy texture caused by pixel-to-pixel variations in the sensor’s response.
Many praise digital for its financial savings over film; however, Lighthill is critical of the unexpected expenses digital actually creates. Because of the rapid changes in the industry, companies need to recover their research and development costs very quickly. Buying the newest camera is extremely expensive, and the model’s value evaporates as the technology becomes obsolete almost immediately.
In addition, the ease of shooting on digital lets photographers take more pictures, filmmakers shoot more footage, and directors roll without cuts between takes. Actors also relax on set, which can lead to additional takes. Because of the increase in footage with digital, many studios and news agencies are forced to hire crews of editors, whereas in the days of film they staffed only one. Overall, Lighthill explains, switching to digital is not cheaper.
Many agree with Lighthill’s preference for film; however, a filmmaker’s taste is not the only deciding factor. Budgets, time constraints, and how the commercial or film will be distributed are among the numerous aspects that determine whether footage is shot on film or digital. Few filmmakers or photographers are able to work in their format of choice.
These constraints affect film students as well. AFI students shoot only 15 percent of their work on film, a trend that worries Lighthill.
Shooting on film is fundamental to imagining the way a scene will look through the lens, a skill that helps cinematographers set up cameras to capture it. The stakes are also higher on film sets, which trains students to keep an organized and disciplined environment.
When speaking on film and digital, Lighthill often uses the word evolution, suggesting that the shift is a natural, expected step. He refuses to call his preference for film nostalgic, and he credits digital for its strengths.
Each year, demand for film declines and digital improves. Will film ever disappear completely?
“That’s impossible to know,” responds Lighthill, “but it won’t be in the foreseeable future.”