Film Industry, Technology of

During the preproduction stages of a feature film, the screenwriter, director, production designer, and cinematographer may have widely differing visions of the ultimate look and sound of the film. Each scene has a series of variables that must be addressed prior to setup and shooting. Decisions about the technology that is to be used during principal photography will affect what the audience ultimately sees at the multiplex. Though the director is responsible for the finished product, the key players on the production team are hired for their expertise in the technical craft of filmmaking.

Format and Film Stock

The first decision to be made regarding the technology of a feature film centers on the screen format, which is the ratio of a film's width to its height (i.e., the aspect ratio). All pre-1952 films and most non-high-definition television (non-HDTV) programs have aspect ratios of 1.33:1, or 4 (width) by 3 (height), which is the same aspect ratio as traditional television screens. It would be unusual for a contemporary film to be shot in the 1.33:1 format, given that modern movie theater screens are built to accommodate wider formats. Films shot on digital camcorders, such as Michael Moore's documentary The Big One (1997), are exceptions; Moore's decision to use a digital camcorder for his guerrilla-style documentary led him to the 1.33:1 aspect ratio.

The more common format choices are the CinemaScope, or anamorphic, ratio (2.35:1) and the non-anamorphic ratio (1.85:1). The content of the film has much to do with the ultimate decision about format. The film Titanic (1997), which featured complex action sequences, virtually required the use of the 2.35:1 format. Conversely, American Pie (1999) was a teenage comedy that kept its characters in tightly framed shots, so the non-anamorphic ratio was deemed better suited to that film. In addition to film content, the equipment that is available will be a factor that helps to determine the final screen format.
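To make the practical effect of these ratios concrete, the short sketch below (Python, with made-up display dimensions) computes how much of a 4:3 screen a wider image occupies when it is letterboxed; only the aspect-ratio values come from the discussion above, and everything else is illustrative.

    # Illustrative sketch: how a widescreen frame letterboxes onto a narrower display.
    # The display dimensions are example numbers, not production specifications.

    def letterbox_height(display_width: float, display_height: float,
                         source_ratio: float) -> float:
        """Height of the visible image when a source with the given aspect ratio
        is fitted to the full width of a display."""
        image_height = display_width / source_ratio
        return min(image_height, display_height)

    if __name__ == "__main__":
        w, h = 640.0, 480.0  # a 4:3 (1.33:1) screen, in arbitrary units
        for ratio in (1.33, 1.85, 2.35):
            visible = letterbox_height(w, h, ratio)
            bar = (h - visible) / 2
            print(f"{ratio}:1 source -> image height {visible:.0f}, "
                  f"letterbox bar of {bar:.0f} top and bottom")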

Prior to shooting, the director and cinematographer must make a technical decision concerning what film stock to use. The film stock will have a significant effect on the look and feel of the film. The Blair Witch Project (1999) combined video and grainy 16-mm film to create a realistic, low-budget look. A more traditional approach is to shoot in the Super 35-mm (Super35) format to reduce grain and capture superior contrast ratios. Lower-budget films and documentaries might be shot on Super 16-mm (Super16) film.

A decision might also be necessary if a film is going to mix daytime and nighttime scenes. Some cinematographers prefer to use the same film stock throughout the production and use filters to create night sequences during daylight hours. Other cinematographers are adamant about shooting night sequences at night, so they may decide to mix, for example, 200-speed film for the daytime sequences with 500-speed film for the nighttime sequences. Technical decisions about formats and film stocks may seem mundane compared to other aspects of the filmmaking process, but the choices made in this area will have a direct effect on the decisions that must then be made about cameras, lenses, and lighting.
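The difference between the two stocks mentioned above can be expressed in exposure stops, since each doubling of film speed gains roughly one stop of sensitivity. The sketch below simply applies that standard relationship; the 200 and 500 figures are the ones from the example above, used only for illustration.

    import math

    def stops_gained(slower_speed: float, faster_speed: float) -> float:
        """Exposure stops gained by switching to a faster film stock:
        one stop per doubling of speed, i.e. log base 2 of the speed ratio."""
        return math.log2(faster_speed / slower_speed)

    if __name__ == "__main__":
        # 200-speed daytime stock versus 500-speed nighttime stock.
        print(f"500 vs. 200 speed: about {stops_gained(200, 500):.2f} stops more sensitivity")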

Cameras and Lenses

Camera selection may appear to be a difficult procedure; however, the choices that are available to filmmakers are somewhat limited. There are only a handful of professional camera makers. The major camera manufacturers include Panavision, Aaton, and Arriflex. Once the format decision is made, the cinematographer looks at what cameras are available. The ultimate decision will be made based primarily on the cinematographer's experience and preference. With camera and format in mind, lens choice is the next major issue.

Several lenses are used on a feature film, and the decision to use a specific lens is based on the action, composition, and lighting of a particular scene or shot. Unlike cameras, lenses come from many makers and in a wide variety of types. The primary companies that produce lenses are Angénieux, Zeiss, and Leitz. For many directors, the decision to shoot Super35 film is based on being able to use spherical lenses. Introduced in the late 1980s, these lenses offer great depth of field, very wide contrast ranges, and exceptional resolution. The Primo spherical lens has had such a profound effect that its inventors were awarded a Scientific and Engineering Award from the Academy of Motion Picture Arts and Sciences in 1998. In addition to spherical lenses, a wide variety of fixed-focal-length (prime) lenses and special-purpose lenses can be used by the cinematographer, with the final decision being based on the needs of the director in a given scene. The director, cinematographer, and lighting director must also determine the number of cameras to be used on the shoot.
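One way to see why lens choice shapes composition is the relationship between focal length and angle of view. The sketch below uses the standard geometric approximation, with a gate width of roughly 24.9 mm assumed as a stand-in for a Super35-style aperture; both the gate figure and the focal lengths are illustrative assumptions rather than specifications drawn from the article.

    import math

    def horizontal_fov_degrees(focal_length_mm: float, gate_width_mm: float = 24.9) -> float:
        """Approximate horizontal angle of view for a given focal length and gate width."""
        return math.degrees(2 * math.atan(gate_width_mm / (2 * focal_length_mm)))

    if __name__ == "__main__":
        # Illustrative focal lengths only; the gate width assumes a Super35-style aperture.
        for focal in (18, 27, 50, 100):
            print(f"{focal:>3} mm lens -> about {horizontal_fov_degrees(focal):.1f} degrees horizontal")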

Prior to the 1980s, almost every motion picture was shot using the single-camera method. The single-camera approach is exacting and methodical. The first step in the process is to shoot the master scene that captures all of the essential action and dialogue. More detailed shots, including close-ups, medium close-ups, and reaction shots, are then shot individually. Each new camera setup is time consuming. A large number of lighting changes and camera setups can often lead to production delays. Throughout the 1990s, the pressure to reduce the shooting schedules of feature film projects led to the increased use of the multicamera technique.

The multicamera technique allows the director to shoot a scene with two or more cameras. Recording the master scene and close-ups simultaneously can make the editing process much easier and generally saves setup time. The multicamera method places demands on the lighting director to light the scene in a manner that accommodates two or three cameras that are shooting simultaneously. When used effectively, the multicamera approach can help the director to trim days from the shooting schedule. While the orchestration that is involved with moving two or more cameras is important, lighting is the key component of the multicamera method.

Lighting

Lighting style and technique set the mood of a scene and can direct the audience's attention to a desired element within the frame. The depth of field, or the range of distances that the camera can hold in acceptable focus, is largely dependent on the lighting. Table 1 lists some of the most common lighting devices and their uses. The lighting crew is responsible for all aspects of lighting technology during production.

Table 1: Common Lighting Devices

Fresnel: A spotlight that is used as a main source of illumination.
Soft light: Any lighting fixture that reflects light to fill in shadows caused by Fresnels.
Kicker: A smaller light that highlights one aspect of a set or an actor.
Barndoor: A four-leaved metal attachment to a light that restricts the light's direction.
Scrim: A wire mesh screen that is placed in front of a light to reduce the light's intensity.
Gel: Plastic material that is placed over a light source to alter its color or color temperature.
Reflector: A large, flat device that is used to bounce light from a source onto the scene.
Gobo/Flag: A large, usually black, object that is used to block light from entering the set.
C-Stand: A clamp device that is designed to hold reflectors, gobos, or other devices.
Light Meter: A device that is used to measure the light in a given scene.
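Because the depth of field mentioned above depends on how much light reaches the lens, and therefore on the aperture the cinematographer can afford to use, a rough numeric sketch may help. It applies the standard hyperfocal-distance approximation; the focal length, subject distance, f-numbers, and circle-of-confusion value are assumptions chosen purely to illustrate how a brighter set deepens focus.

    def depth_of_field(focal_mm: float, f_number: float, subject_mm: float,
                       coc_mm: float = 0.025) -> tuple[float, float]:
        """Near and far limits of acceptable focus (in mm) using the
        standard hyperfocal-distance approximation."""
        hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
        near = hyperfocal * subject_mm / (hyperfocal + (subject_mm - focal_mm))
        if subject_mm >= hyperfocal:
            return near, float("inf")
        far = hyperfocal * subject_mm / (hyperfocal - (subject_mm - focal_mm))
        return near, far

    if __name__ == "__main__":
        # Assumed values: 50 mm lens, subject 3 m away,
        # a brightly lit set (f/8) versus a dim one (f/2).
        for f_number in (2.0, 8.0):
            near, far = depth_of_field(50, f_number, 3000)
            print(f"f/{f_number}: acceptable focus roughly {near / 1000:.2f} m to {far / 1000:.2f} m")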

The lighting crew consists of the lighting director, who has the primary responsibility for creating the look that is called for by the director; the gaffer, the electrician who sets up the lights; and the best boy, who assists the gaffer. Collectively, their job is to control the light that falls within the camera frame.

The color quality of a light, or its relative amounts of red and blue, is measured as color temperature. The color temperature emitted by every light source, natural or human-made, affects the look of the scene. Film stocks and lights have been created to take advantage of various color temperatures. Color-negative film stock is designed for exposure at a color temperature of 3,200 kelvin (K). Many spotlights, called Fresnels, produce the required 3,200-K light. Natural, outdoor light is generally about 5,600 K. Commonly used metal halide arc (HMI) lights produce light at 5,600 K and therefore approximate natural daylight. If the camera crew is told to produce an indoor scene at 3,200 K, it could opt to use only Fresnel lighting fixtures and block out all other light sources. It is more common, however, for the crew to work with Fresnels, HMIs, and natural light pouring through existing windows. In such mixed lighting situations, the crew will use gels, reflectors, flags, and a number of other accessories to balance the color temperature of all sources to the desired level. Controlled, professional lighting is generally the key to a professional-looking project.
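Balancing mixed sources is often reasoned about in mireds (one million divided by the color temperature in kelvin), because correction gels are rated by the mired shift they produce. The sketch below computes the shift needed to move daylight toward tungsten balance and back; the kelvin values are the ones quoted above, and the calculation is a rule-of-thumb illustration rather than a gel catalogue.

    def mired(kelvin: float) -> float:
        """Mired value of a light source: 1,000,000 / color temperature in kelvin."""
        return 1_000_000 / kelvin

    def mired_shift(source_k: float, target_k: float) -> float:
        """Positive shifts warm the light (toward tungsten); negative shifts cool it."""
        return mired(target_k) - mired(source_k)

    if __name__ == "__main__":
        # Daylight or HMI light (about 5,600 K) corrected toward 3,200 K stock, and back.
        print(f"5,600 K -> 3,200 K: shift of about {mired_shift(5600, 3200):+.0f} mireds")
        print(f"3,200 K -> 5,600 K: shift of about {mired_shift(3200, 5600):+.0f} mireds")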

Sound

To Star Wars creator George Lucas, sound is 50 percent of the film. When the sound system fails at a movie theater, people complain vociferously. The art of sound for motion pictures is challenging. A single action sequence in Titanic might feature a mixture of realistic ship sounds, crisply recorded dialogue, background cries of passengers, and original music that establishes the mood of the scene. The mixing of audio is handled so expertly that it may belie the fact that almost all of these sound elements were created separately, rather than as part of the actual filming.

During the production, the sound crew uses audio technology to meet its primary goals of consistent audio and clear dialogue. To capture dialogue, the crew may employ tiny wireless lavalier microphones that are worn by the actors, or it may decide to use the more common boom microphone. The boom operator's job is to keep the boom microphone aimed toward the actor without letting it enter the frame of the shot. The operator must also keep the distance between the microphone and the actor consistent throughout each take of each scene.
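The boom operator's concern with distance has a simple acoustic basis: in the free-field approximation, level falls by about 6 dB each time the source-to-microphone distance doubles. The sketch below quantifies that rule of thumb with invented distances; it is only an illustration of why drift between takes is audible.

    import math

    def level_change_db(old_distance_m: float, new_distance_m: float) -> float:
        """Approximate change in level when the mic-to-actor distance changes,
        using the free-field inverse-square rule: 20 * log10(old / new)."""
        return 20 * math.log10(old_distance_m / new_distance_m)

    if __name__ == "__main__":
        # Illustrative distances: the boom drifts from 0.6 m to 1.2 m between takes.
        print(f"0.6 m -> 1.2 m: about {level_change_db(0.6, 1.2):.1f} dB")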

All of the location sound is recorded on a digital audio recorder that runs separately from the camera; this process is referred to as the double system. It allows the audio team to work on mixing dialogue, music, and sound effects independently in postproduction. Because many scenes in a film may be shot without sound, the supervising sound editor has the responsibility of creating a realistic audio mix for the audience. While the audio postproduction team is working on sound stages, dub stages, and in audio control rooms, another team of artists works on the visuals.
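Double-system workflows ultimately rely on lining picture and sound back up, typically by timecode. As a hedged illustration (assuming a simple non-drop-frame count at 24 frames per second, with invented slate values), the sketch below converts an hours:minutes:seconds:frames reading into an absolute frame number that the two records can be aligned against.

    def timecode_to_frames(hh: int, mm: int, ss: int, ff: int, fps: int = 24) -> int:
        """Absolute frame count for a non-drop-frame timecode at the given frame rate."""
        return (hh * 3600 + mm * 60 + ss) * fps + ff

    if __name__ == "__main__":
        # Invented example: the slate closes at 01:02:03:12 on both the picture
        # log and the sound log, so the offset between them is zero.
        picture = timecode_to_frames(1, 2, 3, 12)
        sound = timecode_to_frames(1, 2, 3, 12)
        print(f"Picture frame {picture}, sound frame {sound}, offset {picture - sound}")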

Editing and Visual Effects

Once the film is shot, two separate processes will generally occur with the acquired footage. The first is editing. Few directors still cut actual film; Steven Spielberg is one exception, using Moviola and/or Steenbeck film editors to cut together a final "master" print, but the practice is becoming increasingly rare. The most common technology used in contemporary film editing is the computer-based nonlinear editor. The film is "digitized," or transferred from film to computer data. Many film editors use the Avid brand of nonlinear editing system. The Avid systems and similar computer-based editors provide a series of editing options, including cuts, fades, and dissolves. The nonlinear system allows directors and editors to make last-minute changes with ease, because changes can be made with literally one or two keystrokes at the computer workstation.
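The ease of those last-minute changes comes from the fact that a nonlinear system stores an edit as data rather than as physically spliced film. The sketch below models a cut list in the simplest possible way; the clip names, frame numbers, and transition labels are invented for illustration and do not reflect any particular editing system's file format.

    from dataclasses import dataclass

    @dataclass
    class Edit:
        clip: str        # source clip identifier (invented names)
        src_in: int      # first frame used from the source clip
        src_out: int     # last frame used from the source clip
        transition: str  # "cut", "dissolve", or "fade"

    def total_frames(cut_list: list[Edit]) -> int:
        """Running length of the assembled sequence, in frames."""
        return sum(e.src_out - e.src_in + 1 for e in cut_list)

    if __name__ == "__main__":
        sequence = [
            Edit("scene12_take3", 100, 243, "cut"),
            Edit("scene12_take7", 80, 191, "dissolve"),
        ]
        # A "last-minute change" is a list operation, not a re-splice of film.
        sequence.insert(1, Edit("scene12_insert", 10, 57, "cut"))
        print(f"Sequence length: {total_frames(sequence)} frames")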

The craft known as visual effects was at one time associated primarily with science fiction and high-budget films. It has since become commonplace to use computer-based visual effects in almost every type of film. The possibilities open to a visual-effects team are nearly limitless: color-correcting elements within a scene, removing scratches from a film (or adding them for effect), and "airbrushing" unwanted elements out of a scene pose little problem with modern technology.

When reshooting a scene is impossible, the visual-effects team can often save it in postproduction. For example, in one scene in the film Braveheart (1995), it appears as if thousands of extras are storming across the battlefield. The visual-effects team took the rather limited-looking groups of extras that were actually filmed and digitally copied and pasted them until the battlefield appeared full. The result is both convincing and one of the film's highlights. More complex films, such as The Matrix (1999), feature a dizzying array of visual effects. Many of that film's key sequences were shot with actors in front of green screens, and the effects team used digital compositing to create futuristic and compelling action sequences. The death of a performer during production, such as when Brandon Lee died on the set of The Crow (1994) or when Oliver Reed died during the production of Gladiator (2000), might force a studio to terminate or recast a project. In both cases, however, the scenes that remained to be shot were completed with the use of computer-generated "virtual" actors.
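The digital compositing mentioned above rests, at the pixel level, on a very simple blend: once a matte (alpha) has been pulled from the green screen, each foreground pixel is placed over the background with the standard "over" formula. The sketch below shows that blend for a single pixel; the color and alpha values are arbitrary examples, not data from any film.

    def over(fg: tuple[float, float, float], bg: tuple[float, float, float],
             alpha: float) -> tuple[float, ...]:
        """Composite a foreground pixel over a background pixel:
        out = fg * alpha + bg * (1 - alpha), per channel, with alpha in [0, 1]."""
        return tuple(f * alpha + b * (1 - alpha) for f, b in zip(fg, bg))

    if __name__ == "__main__":
        actor = (0.80, 0.62, 0.55)        # arbitrary foreground color
        digital_set = (0.10, 0.12, 0.30)  # arbitrary background color
        print(over(actor, digital_set, alpha=1.0))   # fully opaque: the actor covers the set
        print(over(actor, digital_set, alpha=0.35))  # soft matte edge: a blend of the two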

The pace of computing technology has been so remarkable that entire films are being created on the computer desktop. Toy Story 2 (1999) is an example; the film's preproduction, production, and postproduction all took place in the digital domain. The film was even projected at some theaters from a computer hard drive that was connected to a filmless digital light processing (DLP) projector. The progression of film away from an analog base toward a digital one is one of several key trends for the industry.

Trends

Several key developments in film technology are changing the production and postproduction process in contemporary filmmaking. These trends are affecting the very heart of the films seen at the multiplex.

Cameras

The physical size of film cameras continues to decrease. This allows directors much more flexibility in obtaining the shots that they need. Because cameras are lighter than ever before, hand-held shots are being used more effectively than at any other time in the history of cinema. The audience may not realize that shots are hand-held, thanks to improvements made in Steadicam products that counterbalance the motion of the camera operator. Smaller cameras can also be attached to radio-controlled devices to follow the action of a scene. As cameras become smaller, the types of shots that are possible increase dramatically.

Video assists are being used more commonly to monitor camera shots. Instead of waiting for the "dailies" to return from the developing lab, directors watch a video monitor to see if a shot is acceptable. Some video assists have a freeze-frame capability that allows the director to call up a previously shot scene and reset all of the scenic elements and actors to ensure continuity from take to take.

A closely followed area in camera technology is that of the "filmless" camera. High-definition cameras are being tested that record images either on digital videodiscs (DVDs) or directly to hard drives. The use of such cameras would allow for instantaneous review and preliminary editing of material while eliminating film and developing costs. Studio pressure to release film content at a faster pace could lead some directors toward digital film acquisition in the field.

Audio

In the realm of sound, film continues to move closer to an immersive experience for audience members. Three formats of digital audio bring six or more channels of audio to moviegoers. The three formats are:

  1. Digital Theater Sound (DTS), which is a six-channel digital audio system that reproduces audio from a separate compact disc (CD) that is synchronized to the film,
  2. Sony Dynamic Digital Sound (SDDS), which was originally a six-channel digital audio system but was expanded to an eight-channel system in the late 1990s, and
  3. Dolby Digital Surround, or Spectral Recording-Digital (SR-D), which, following the trend toward more immersive audio, improved the original six-channel system by adding speakers at the back of the theater.

Audio engineers continue to experiment with innovations that might result in audio emanating from the ceiling and floor of the theater.
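To make the channel counts above concrete: a "six-channel" theatrical mix is conventionally a 5.1 layout of five full-range channels plus a low-frequency effects channel. The sketch below simply enumerates that conventional layout; the descriptions are generic and not tied to any one format's specification.

    # Conventional 5.1 ("six-channel") theatrical layout: five full-range
    # channels plus one low-frequency effects (LFE) channel.
    FIVE_POINT_ONE = {
        "L": "front left",
        "C": "front center (dialogue is typically anchored here)",
        "R": "front right",
        "Ls": "left surround",
        "Rs": "right surround",
        "LFE": "low-frequency effects (the '.1')",
    }

    if __name__ == "__main__":
        print(f"{len(FIVE_POINT_ONE)} channels:")
        for name, role in FIVE_POINT_ONE.items():
            print(f"  {name:>3}: {role}")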

Digitization

The progression of film toward a fully digital medium seems clear. Audio production, postproduction, and exhibition continue to be largely digital. Visual effects, titles, and editing are handled predominantly at the computer desktop. Filmless DLP projection systems have proven successful in theaters. Home video continues its progression toward DVD and Internet-delivered film content. As a result, the only major analog component of the filmmaking process is the actual shooting of the film. It is not inconceivable that the film technology of tomorrow will be fully digital.

See also: Digital Communication; Digital Media Systems; Film Industry; Film Industry, Careers in; Film Industry, History of; Film Industry, Production Process of; Recording Industry, Technology of.

Bibliography

Brinkmann, Ron. (1999). The Art and Science of Digital Compositing. San Francisco, CA: Morgan Kaufmann Publishers.

Eastman Kodak Company. (2001). "Entertainment Imaging." <http://www.kodak.com/US/en/motion/index.shtml>.

Happé, L. Bernard. (1971). Basic Motion Picture Technology. New York: Hastings House.

Jones, Stuart Blake; Kallenberger, Richard H.; and Cvjetnicanin, George D. (2000). Film Into Video: A Guide to Merging the Technologies. Boston: Focal Press.

Rickitt, Richard. (2000). Special Effects. New York: Watson-Guptill Publications.

Salt, Barry. (1999). Film Style & Technology: History and Analysis, 2nd edition. New York: Samuel French Trade.

Yewdall, David L. (2000). The Practical Art of Motion Picture Sound. Boston: Focal Press.

David Sedman