Motion Picture Exhibition in 1970s America



Fundamental Audience Changes
Exhibitors React
A New Theatrical World
New Style of Moviegoing
TV as the Subsequent-Run Cinema
Television Redefines U.S. Movie Watching

Douglas Gomery1

On the surface, the exhibition side of the movie industry seemed stable during the 1970s. While the major Hollywood studios struggled to redefine themselves within the TV age, the movie show seemed vanilla plain, as far from the golden days of the movie palace as one could imagine. The only change appeared to come at the popcorn stand, where the number of treats available at inflated prices seemed to increase daily. Indeed, during the 1970s the average theater saw its revenue share from concession sales rise from one-eighth to one-fifth of the total. But this seeming stasis masked considerable transformation. There was steady growth in the number of available indoor screens in commercial use, as multiplexing slowly pushed the figure from 10,000 towards 15,000.2

The 1970s surely was the decade when the theatrical moviegoing experience hit its nadir in the United States. Cluster after cluster of unadorned screening rooms typically offered only previews of Hollywood's latest features, maybe an ad or two, and then the blockbuster of the week. It was as if, realizing that they had lost the battle with TV, the exhibitors gave up almost all pretense of the competition that had long defined this sector of the industry. Gone were the architectural fantasies—screen number two offered even fewer amenities than were available at home. The movie house architecture seemed to become the international style at its soulless worst. Its function, in the age of television, seemed clear: collect the money, sell the popcorn, show the blockbuster, and then repeat the cycle all over again.

But simply focusing on this sad state of affairs loses sight of the real and significant changes in the presentation and viewing of movies that were taking place. Historically, not only was there increased interest in movie culture during the 1970s, there was continued growth in attendance as maturing baby boomers sought out their cinematic favorites. These boomers lived by and large in the suburbs surrounding America's cities, so new chains of exhibitors constructed new viewing sites in suburban shopping centers and malls. And demographics was not the only historical force pushing change. Technology redefined the movie show during the 1970s. In theaters this came through new sound systems that were far superior to any available at home; but it was technical change in the home—from new forms of television—that revolutionized movie presentation, particularly as cable TV provided seemingly endless screenings of both new and old Hollywood favorites.

Fundamental Audience Changes

To understand the basic causes transforming movie exhibition and presentation, we must begin with the emergence of the new audience demographics. The baby boomers, folks born after 1945, came of age during the 1970s and generated more loyal movie fans than at any time in history. Living in the growing and expanding suburbs surrounding American cities, their interest caused more and newer venues of exhibition to appear. Their attendance kept theatrical exhibition alive and well despite many pundits' predictions that TV would kill going out to the movies. Baby boomers were altering all forms of American life, and none more so than moviegoing.3

The settling of the suburbs was one of the great postwar historical phenomena. Nearly all the new population growth in the United States after the war took place in the suburban rings around America's cities. American suburbs grew fifteen times faster than any other segment of the country. And once the movement to the suburbs began, more than one million acres of farmland were plowed under each year. In order to keep the move within the family budget, to purchase the biggest possible house, on the biggest possible lot, American families sought places where land prices were low and relatively spacious homes were affordable. Supported by Veterans Administration and Farmers Home Administration mortgages, home ownership in the United States increased by nearly 50 percent from 1945 to 1950, and went up another 50 percent in the decade after that. By 1960, for the first time in the history of the United States, more Americans owned houses than rented.4

After the Second World War, Americans accelerated a trek that they had begun at the turn of the century—the movement toward single family dwellings in the suburbs of America's cities. To appreciate the scope of this internal migration, compare it to the more famous transatlantic movement from Europe to the United States around the turn of the century. In 1907, when European emigration was at its peak, more than one million Europeans landed in the United States, precisely the magnitude of the suburban migration of the late 1940s and early 1950s. Coupled with this massive move to the suburbs was an equally historical increase in family size—the baby boom. The two-child family common since the turn of the century was joined in great numbers by families of three children or more. Somewhat cynically, economists fit children into their models as durable goods "consumed" by the parents. (Durable, versus nondurable, goods provide a stream of long-term, versus short-term, benefits. A child generates costs in terms of doctor's care, clothes, food, education, and so on. The benefits are traditionally explained in terms of increased labor value from children as workers around the house or in a family business, and in terms of increased psychic benefits to the parents, what some might call the "make-us-proud-of-you" effect.)

The baby boom of the 1940s and 1950s would constitute the core of the movie audience of the 1970s. The baby boom in the United States proved remarkable because it included all segments of the society. Bachelors and spinsters, childless couples, and couples with only one child all but vanished, and the odds of a mother with four children having a fifth or sixth actually declined. What developed was an unprecedented concentration of families with two or three children. That is, as with other consumer durable goods, families went out and acquired children as fast as they could, though not in numbers larger than they could afford. With the pent-up demand, which had been put off during the Great Depression and the Second World War, nearly everyone of family-creating age acquired children.

Another stunning reversal of historical trends lay in the characteristics of the families who were having more children. Demographers have tended to conclude that modernization leads to fewer children. The baby boom should have seen the lowest participation among the urban, educated, and rich, so it was unexpected when these groups actually led the population explosion. Families with high school- and college-educated parents, upper middle class in terms of income, had larger and larger families. Lawyers, doctors, and executives contributed more proportionally to the baby boom than did the factory workers and farmers usually thought to produce the biggest families. Likewise, and equally unexpected, families in urban and suburban areas contributed more children than did rural Americans.5

Exhibitors React

The executives of the film industry were not oblivious to the economic consequences of the vast social and demographic factors reconfiguring their potential audiences. Yet the chief operating officers of the major Hollywood studios could not react because, under United States v. Paramount Pictures et al., the antitrust case that had been decided by the United States Supreme Court in May 1948, they had been given five years (1948–1953) to divest themselves of the movie theater side of the film business. After the Paramount case the movie industry—until the middle 1980s—divided into two parts: Hollywood firms controlled production and distribution of films, while different companies, for the first time since the days of the nickelodeon, dominated the exhibition marketplace.6

The five successor corporations that acquired the theater chains of the five major Paramount defendants—Loew's, Inc.; Paramount; RKO; 20th Century-Fox; and Warner Bros.—dealt with the changing face of the moviegoing audience as best they could. They could acquire theaters, but only after having convinced a United States District Court that such acquisitions would not unduly restrain trade. Each petition required a formal hearing, often facing stiff opposition by attorneys representing the Department of Justice. Significantly, time and again the closing of a downtown movie palace and the purchase of one in the suburbs was not considered equivalent. It was not until 1974 that the supervising judge in the United States District Court of New York agreed to let one of the surviving former Big Five theater chains go into the suburbs of a city where they had formerly held sway.

Indeed, by 1979 only Loew's, Inc. was still operated by its original successor corporation. In 1959, once Loew's had fully complied with the Court's decrees, the new corporation, which was no longer affiliated with MGM at all, operated fewer than one hundred theaters. By 1978, Loew's had sold most of its original picture palaces and had transformed itself into a small chain of multiplexes—sixty theaters with 125 screens. The Loew's corporation's principal business had become hotels, insurance, and cigarette manufacturing.7

In 1953, the former Paramount theaters were purchased by American Broadcasting Companies, Inc., ABC television and radio. In 1948, the year of the Paramount consent decrees, Paramount itself had been the largest of the Hollywood-owned circuits with nearly 1,500 theaters. But by 1957, after the specific divestitures ordered by the Court, ABC's Paramount division had just over 500. Thirteen years later the circuit numbered slightly more than 400, though the Paramount chain still generated millions of dollars in revenues and should be credited with providing the monies that kept ABC television going during the hard times of the 1950s and 1960s.8 By the 1970s ABC was going great guns as a television network, and sold its theaters to one of its former employees, Henry Plitt, in two stages. In 1974, the northern division of 123 theaters was sold to Plitt for $25 million. Four years later ABC exited the theater business entirely, selling the remainder of the theaters (its southern division) to Plitt for approximately $50 million. By then theater revenues of more than $8 million per year constituted only one-twentieth of ABC's total business.9

The successor to 20th Century-Fox's theaters, the National General Corporation, controlled nearly 550 theaters at the time of the Fox divorcement and divestiture in 1951. Six years later, by court direction, the number had nearly been halved. The company struggled along, trying to deal with the new world of film exhibition, but grew little. Thus it surprised no one that in 1973 National General, still controlling some 240 theaters, was sold to a former employee, Ted Mann, who concentrated his activities in the Los Angeles area. Ironically, in 1985 Mann sold his chain to Paramount, and two years later Paramount merged them into the Cineamerica chain it owned and operated with Warner Communications. Thus the former Fox theaters became the core of a nationwide chain that is now controlled by Paramount and Warner Bros., which had long been two of Fox's chief rivals. And ownership of exhibition venues has once again returned to major Hollywood moviemaking powers.

The fates of the final two of the successor companies—RKO and Warner Bros.—would also become intertwined. RKO Theatres had only 124 theaters at the time of its divorcement in 1948. The chain was sold twice, and when the Glen Alden Corporation surfaced as owner in 1967 only thirty-two theaters were left. Meanwhile, the former Warner Bros. theaters had become the core of the Stanley Warner Theatre chain, beginning with more than 400 theaters at the time of the 1951 consent decree. But under court directives that forced sales, the number of theaters that Stanley Warner controlled fell to 300 by 1957, about 200 by 1960, and approximately 150 by 1967. In 1967, the Glen Alden Corporation merged what remained of the two chains into RKO-Stanley Warner, but the chain did not thrive, and yet another company, Cinerama, purchased it in the 1970s.10

The break-up of the Big Five movie theater circuits provided the openings in the theater business that keyed the changes of the 1970s, with new entrepreneurs responding to suburbanization and the baby boom. The first response involved the construction of more "auto-theaters" at the edges of all U.S. cities. Though drive-in theaters had, in fact, been around in small numbers since the mid-1930s, when the necessary building materials became available after the Second World War thousands of drive-ins opened in paved-over farm fields. The development of the drive-in was a peculiarly American phenomenon generated by the huge demand for auto-convenient movie exhibition by the millions of new suburbanites from coast to coast.11 Sitting in parked cars, movie fans watched double and triple features on massive outdoor screens.

By 1952 the average attendance at drive-ins had grown to nearly four million patrons per week. One estimate had the public spending more at drive-ins, which had not existed a mere decade before, than at live theater, opera, and professional and college football combined. By the early 1960s, drive-ins accounted for one out of every five movie viewers. For the first time, during one week in June of 1956, more people attended drive-ins than went to traditional "hard-top" theaters, thus sealing a pact initiated by early screenings in amusement parks and by traveling exhibitors, and stimulated by the air-conditioned movie palaces of the 1920s. Moviegoing attendance peaked when the weather was at its most pleasant, during the summer, a trend that has continued to the present day.

But the drive-in, even with CinemaScope or Panavision, did not provide a permanent solution to serving suburban America; or even, in the long run, a viable alternative to the basic comfort of the suburban television room. If the movie palace had been the grandest of arenas in which to enjoy watching films, the family car was not. What fantasies it held, both for teens desperate to get a little unsupervised privacy, and for mom, dad, and the kids getting out of the house for a cheap night of fun, had little to do with the movie viewing experience. The lone attraction of the drive-in seemed to be that it was so inexpensive—two dollars a car for whoever could squeeze in. Pacific Theatres' slogan for its southern California customers became: "Come as you are in the family car." But price, convenience, and informality could not clear a foggy windshield or improve the sounds from the tinny loudspeaker hooked to the car window, which frequently fell off as someone climbed out to run to the restroom or buy more popcorn.

By the 1970s the drive-in had passed its heyday. The land that in the early 1950s had lain at the edge of town was becoming too valuable. Suburbanization had continued unabated and the acres required for a drive-in could be more profitably employed as space for a score of new homes. Furthermore, even drive-ins offering inexpensive triple features like Nashville Girl, Blazing Stewardesses, and Dirty Mary, Crazy Larry failed to make money, and as the receipts ebbed exhibitors sought longer term and more permanent responses to the suburbanization of America, and to the television in every household. Ironically, the sites they chose lay near the very drive-in theaters they were abandoning.

The 1970s suburban theater emerged from a radical transformation in American retailing. As the modern shopping center of the 1960s gave way during the 1970s to its enclosed successor, the shopping mall, the motion picture industry followed, and thereafter came the familiar multiplex theaters and the mall cinemas ubiquitous in the latter quarter of the twentieth century. By 1980, to most Americans going to the movies meant going to the mall. To the exhibition industry the movement to the mall completed the transition away from the downtown-oriented, run-zone-clearance system that had divided the nation into 30 zones whose theaters were classified (and admissions accordingly scaled) as first-run, second-run, or subsequent-run, with "clearance" periods mandated between each run in order to squeeze the maximum profit out of every release.

From the time the first movie houses opened until the 1960s, downtown dominated the retailing scene in the United States. Outlying centers grew at the intersections of transportation lines, and were accessed by public transportation and surrounded by apartment buildings, but postwar suburbia was based on the automobile and the single-family dwelling. There were very few shopping centers built in the 1920s or 1930s: Upper Darby in Philadelphia (1927), Highland Park in Dallas (1931), and River Oaks in Houston (1937) offer rare examples. The most famous shopping center of all was J. C. Nichols' Country Club Plaza, located in suburban Kansas City, Missouri, and opened in 1925. This is usually taken as the archetype for the modern, pre-planned shopping center since Country Club Plaza was auto-oriented, had extensive homogeneous landscaping, and offered acres of free parking. But the Great Depression and World War II called a halt to cloning, and as of 1945 there were only eight shopping centers in the United States.

However, the suburbanization of the late 1940s gave rise to increasing numbers of shopping centers as we know them today. The number of shopping centers grew from a few hundred in 1950 to nearly 3,000 in 1958 to more than 7,000 in 1963, with further extraordinary growth taking place in the late 1960s and 1970s. By 1980 the United States had 22,000 shopping centers. Even in major centers like New York City, Chicago, and San Francisco, the bulk of retail trade had moved to the edge of the city. And new regional centers emerged at the intersection of the major new highways built with funds from the 1956 Federal Highway Act. All shopping centers had acres of parking; increasingly they were enclosed from the elements; and their special services included soothing piped-in music, nurseries, band concerts, fashion shows, and, of course, movie theaters.12

The concept of the enclosed, climate-controlled shopping center or mall was first introduced at Minneapolis's Southdale Shopping Center in 1956. Malls required large tracts of land and multi-million dollar financing, so it took time and planning before many more were built. The mall-building movement truly commenced in the late 1960s, and soon the Paramus Mall in Northern New Jersey; Tyson's Corner Shopping Center outside Washington, D.C.; Northridge near Milwaukee; Sun Valley in Concord, California; and Woodfield Mall in Schaumburg, Illinois, near Chicago's O'Hare airport provided the shopping hubs for their respective communities. Eventually, "super-regional" malls were developed to draw shoppers from several states, not simply the nearby metropolitan areas. These malls contained hundreds of stores, at least four department store hubs, restaurants, hotels, ice skating rinks, and, of course, multiple movie "screens."13

The idea of an enclosed place in which to shop and play became one of the defining icons of the 1970s. Whether it be the theme-park Olde Mistick Village in Mystic, Connecticut (a whole shopping center representing a New England village); plain-vanilla Towne East in Wichita, Kansas; or the self-proclaimed world's largest mall in Edmonton, Alberta, Canada, the mall is defined by its similarities, not its differences. Everywhere a Sears, J.C. Penney, or Gimbels stood alongside a Dalton or Walden Bookstore and a Limited clothing store. Malls are among the most meticulously planned structures of the late twentieth century, brightly lit to promote a safe image, enclosed to keep out the elements, convenient to the highway to make a car trip seemingly effortless. Here was one-stop shopping superior to what any aging downtown could offer. Thus, by the late 1970s Americans were reported to be spending more time in shopping malls than anywhere outside their jobs or homes.14

The look of the shopping center was a knock-off of the international style: function dictated everything; stylistic considerations were set aside. For the movie theater this meant stripping all the art deco decoration that had made theaters of the 1930s and 1940s so attractive. Little of the earlier architecture survived. Though there was a necessary marquee to announce the films, the lobby became simply a place to wait, with a multipurpose concession stand, and a men's and women's room. The auditorium was a minimalist box with a screen at one end and seats in front, and it was as if one had come into a picture palace stripped of all possible decoration. Gone were the lobby cards, posters, and stills of an earlier generation.

The Paramount case had opened up the exhibition market for new shopping mall theaters; and the shopping center offered the locus for new power and profits. From these initial openings came the new theater chains, those that dominate the final decade of the twentieth century. Consider the case of General Cinema, a company that grew to become one of the largest national chains of the 1970s. When Philip Smith, the founder of General Cinema, built his first drive-in outside Detroit in 1935, the downtown movie palace was still king. Indeed, in Detroit his competition was the most powerful of the theater chains, the Paramount Publix circuit. A drive-in was the only way he could enter the market because Paramount did not absolutely control that market niche. But after the end of the Second World War Paramount signed a consent decree in 1949, and suddenly Detroit was an open market for the movies. Smith, the pioneer drive-in exhibitor, had a step on the competition, and prospered by building drive-in after drive-in. By 1949 the chain had more than twenty sites. Smith and his associates booked first-run films, admitting children free as long as their parents paid full fare. Smith sought to attract the suburban family by emphasizing vast concession stands enclosed within self-service cafeteria-style cinder block buildings, which often also included the projection booth and the manager's office. General Cinema likes to take credit for the introduction of the folding tray for cars, the extra-large sized (and more profitable) drinking cups, the barbecue hamburger, the pizza pie craze, and the wooden stirrer for coffee. Whether they were first or not does not matter. General Cinema concentrated on the concessions and the enormous profitability of this area of the business.15

Philip Smith died in 1961, just as the drive-in industry was reaching its zenith. His son, Richard Smith, a thirty-six-year-old Harvard-educated assistant to the president, went one step farther than his father and moved General Cinema into the new suburban shopping centers that were being built across the United States. By the late 1960s, General Cinema, as the company was then named, owned nearly one hundred shopping-center theaters, the largest such collection in the United States at the time. In 1967, General Cinema, with shopping-center theaters providing more than half its revenues, earned more than $2 million in profits on more than $40 million in revenues. And little of General Cinema's growth was due to the purchase of existing downtown theaters. In 1967 it had some 150 theaters in twenty-six states. By 1970 the number of theaters topped 200, with more than 250 screens. By mid-decade the numbers had doubled, with the number of General Cinema drive-ins declining from thirty-eight in 1966 to ten in 1978.16

As its profits rose, General Cinema diversified in 1968 into the allied soft-drink business—the company had recognized the importance of this aspect of the business as it developed its concession stands, first in the drive-ins and then in the suburban shopping center theaters. General Cinema first bought a Pepsi-Cola bottler with four plants in Florida and Ohio. Smith stated at the time that the young people of the country were the best movie customers as well as the best soft-drink customers. Moreover, he noted that the two businesses operated in similar fashion; both were decentralized. By the 1980s General Cinema would be one of the largest soft drink bottlers in the United States, far better known for this segment of its operation than its theaters.17

By the 1960s many exhibitors besides Richard Smith had come to realize that the shopping center cinema was beginning to define moviegoing. During the 1960s, with pre-planned malls opening in record numbers around the United States, innovative theater chains worked with shopping center developers to jam half a dozen multiplexes of prefab, indistinguishable design into malls across the nation. With acres of free parking, and easy access by super-highway, the movies in the shopping center grew to accommodate the majority of the nation's indoor screens, and became the locus of Hollywood's attentions. No company symbolizes the growth of the multiplex more than the Kansas City-based American Multi-Cinema. As the shopping center world expanded to take over retailing in the United States, American Multi-Cinema made it a goal to offer each one a half dozen average indoor screens with a few hundred seats and one concession counter, staffed by two high-school students, one projectionist, and one manager who doubled as a ticket taker. Like the fast food operations across the nation, labor in the movie theater was reduced to low-cost, untrained servers and button pushers. With costs so low, it only required a few "boffo" box-office films a year to guarantee a profitable venture.18

Although the multi-cinema concept was not invented by American Multi-Cinema, that company took the concept to its logical extension, opening hundreds of similar operations from coast to coast. In July 1963, the predecessor company to American Multi-Cinema had opened the Parkway Twin, reputedly the nation's first twin-screen theater in a shopping center. Significantly, the Parkway—which was the first theater constructed in the Kansas City area since the late 1930s—presented first-run films. Parkway One had four hundred seats; Parkway Two had three hundred seats, but they had a common ticket booth and a single concession stand. The Parkway cost around $400,000 to build, and was watched closely from the beginning by the movie trade as a test case. It succeeded beyond all expectations.19 The Metro Plaza complex, which American Multi-Cinema opened in December 1966 in Kansas City, was acclaimed as the world's first planned four-plex. The "first" six-plex came in January 1969 in Omaha, Nebraska. The "first" eight-plex came in 1974 in the Omni International Complex in Atlanta, with a total of 1,175 seats, in sum nearly the size of the downtown picture palace—the Omni International Complex contained auditoria ranging from 100 to 200 seats, with a shared concession stand, a single box-office, and common rest rooms.20 While it is not clear that these actually were the first multiplexes of their size in the United States, they were certainly among the first.

American Multi-Cinema had emerged from Durwood Enterprises, which had been in business since 1920 (and as late as 1959 had but a dozen theaters, including a handful of drive-ins) and had found a way to prosper in the movie theater business. When company patriarch Edward Durwood died in 1960, his son, Stanley, expanded the company. In 1969 it was renamed American Multi-Cinema, a name more descriptive of its operation. As shopping centers were being built, this company in the heart of the United States took advantage of the opening. It was not a franchise system; rather it owned and operated all its screens.21 Throughout the 1960s Stanley Durwood worked with architect Stanley Staas to design and build the Empire-4, the Midland-3, and the Brywood-6, in the Kansas City area. American Multi-Cinema duplicated this experience throughout the nation and by 1972 it owned more than 160 screens in nearly thirty cities in some thirteen states. As Stanley Durwood noted: "Four theaters [later six or more] enable us to provide a variety of entertainment in one location. We can present films for children, general audience, and adults, all at the same time." American Multi-Cinema would extend the concept so that new complexes in the late 1970s handled a dozen or more screens.22

A New Theatrical World

The corporate triumphs of General Cinema and American Multi-Cinema led them and two other companies to develop national chains of hundreds of theaters each. These were the successors to the chains of the Big Five. The difference lay not only in the size of the auditoria (about 200 seats) and the number in any location (multiplexes of up to twenty screens), but also in the fact that these chains did not concentrate in single regions of the country, but were spread the length and breadth of the United States. By the early 1980s, according to Variety, four chains dominated the United States.23

First in line was General Cinema, with 1,000 screens in some 350 locations in nearly all forty-eight continental states. Few of the original drive-ins were left by 1980; this was a multiplex operation pure and simple.

Second place was occupied by United Artists Communications, with slightly fewer than 1,000 screens in nearly 350 locations, with one in twenty being drive-ins. United Artists Theatre Circuit, Inc. grew to become one of the largest theater circuits in the nation, ironically more powerful than the more famous Hollywood producer of similar name. It was a leader through the 1960s and 1970s in building multi-screen theater complexes in suburban shopping centers. United Artists had begun as an adjunct of Joseph M. Schenck's interests when he was the head of the United Artists' moviemaking company in the late 1920s and into the early 1930s. Although the two companies operated closely when Schenck was on top, he owned the theater chain outright and kept it separate from the moviemaking company. Thus when Schenck parted with United Artists to form 20th Century-Fox, he and his partners retained their hold on the theater chain; and United Artists Theatres represented the top movie palaces in such cities as Detroit, Chicago, and Los Angeles—and in partnership with others (including Sid Grauman) in Pittsburgh, Baltimore, Louisville, New York City, and Los Angeles with the famous Grauman's Chinese and Egyptian.

Third in size was American Multi-Cinema, with over 700 screens in 130 locations across the United States. None were drive-ins. And fourth place was held by Plitt Theatres, then based in Los Angeles, having moved from the former headquarters in Chicago, with 600 screens in nearly 300 locations. Again, none were drive-ins.

No other chain had more than 350 screens. The smaller circuits, with a couple of hundred theater screens, included the midwestern-based Commonwealth chain, the southern-based Martin Theatres, the Boston-based National Amusements, the Southern Cobb circuit, Kerasotes of Illinois, and Pacific of Southern California.

New Style of Moviegoing

What movie patrons received for their entertainment dollar with movies in the mall, save locational centrality, proved as far from the golden days of the movie palace as one could imagine. The clusters of unadorned screening rooms offered only feature films and concession stands. Space was at a premium, and screens were often sandwiched in the basement of a mall. It was as if, having realized they had lost the battle with television and the living room, the movie theaters gave up almost all pretense of the struggle at the level of architectural fantasy and the viewing experience, and actually produced interiors with less to offer than at home. Taking their cues from the dominant, Bauhaus-inspired trend in architecture, the new movie chains seemed to push the slogan of the international style to new lows of literalness: only function should dictate building form. The function in the age of television was clear: show blockbuster feature films and nothing else. Gone was the architectural ambience of the movie palace; any decoration on the side walls or around the screen seemed irrelevant. The movies became self-service: ushers were rarely sighted. The idea of live entertainment plus movies was something that grandmother and grandfather talked about. Only air conditioning continued to add a measure of pleasure. The mall theaters offered minimalist moviegoing.

Viewing conditions had reached an all-time low. To shoe-horn as many auditoria as possible (rarely with more than 250 seats each) into a corner of a shopping center, theater owners placed projection booths that seldom lined up with the screen. That is, one booth served two or more spaces, so the image invariably came out with one half of the movie larger than the other (a phenomenon called "keystoning"). To further skimp on costs, theater owners inadequately padded walls between auditoria. Thus, for example, as one tried to catch a quiet moment of Annie Hall (Woody Allen, 1977), more often than not the rousing battle sounds of Star Wars (George Lucas, 1977) poured through the wall, drowning out dialogue and distracting attention.

Ironically, part of the sound problems resulted from one of the few improvements in the viewing experience. Dolby sound systems managed to outdo the television set by eliminating all extraneous noises and placing six-foot speakers in every corner of the auditorium, and behind the screen. The sound in 200-seat auditoria became so good that it could only have been properly accommodated in the 3,000-seat movie palaces of the 1920s.24 Such systems, for a generation trained on home stereos and portable cassette players, made sound at the movies far superior to all but the best home stereo systems. Four-inch television speakers were simply no match. Now, finally, the images of Panavision were coupled with new, clear sound, ratcheted to levels that offered the audiences of the 1980s a new, totally enveloping technological experience.

Unfortunately all this sound seemed largely to encourage television-trained viewers to talk during the screening. Television had trained movie fans at home to accept constant conversation as part of the standard viewing experience. By the 1980s, talking and constant commotion had become the norm at the movies in the mall.

Other interior amenities, once taken for granted by film fans, disappeared in the age of the multiplex. Waiting in the lobby of a movie palace that was designed to hold as many folks as could sit in the auditorium was a wonder-filled experience. In the multiplex, lines often spilled out into the mall, tangling with shoppers. What space there was in the lobby per se was invariably taken up by the popcorn stand, which in some cases also hawked T-shirts and posters, while attendants made change for the video games tucked into any corner adaptable as a "profit center." In many ways going to the movies had been reduced to the equivalent of standing in line at the K-Mart to buy a tire or pick out lawn furniture. The physical viewing conditions of the multiplex, save its advantage of superior sound, seemed a throwback to the nickelodeon era. Indeed, rarely was a live projectionist back there in the booth to deal with a film that tore apart. The social experience of vast crowds in a thousand-seat auditorium had been fragmented into screening rooms typically holding no more than 200 noisy patrons.

The suburbs and television had changed the nature of film in a number of ways, none more important than this fragmentation of viewing conditions. Indeed, drawing folks away from their television sets and living rooms required a diverse set of attractions, all designed to create the blockbuster. That is, so much money could be made on a single film, such as a Star Wars (1977) or E. T.: The Extra-Terrestrial (Steven Spielberg, 1982), that fashioning blockbusters became the single purpose of the contemporary film industry. But no one could predict which film would become a blockbuster and which not. So theater owners tended to construct one large auditorium (with about 500 seats) surrounded by a half dozen auditoria with 100 seats each. Films were tested in the smaller auditoria and only the true blockbusters moved into the more spacious venue.

The multiplex put an end to folks over thirty-five regularly going out to the movies. By the late 1970s there were too many sticky floors, too many noisy patrons, too many films that seemed alike. Yet there was a reaction. Despite all sorts of predictions of the demise of the cinema experience, during the 1980s theaters would get better. They would also grow in number, making more screens available than had existed in the United States during the peak period of the late 1940s. As a new theater circuit that was not simply taking advantage of openings provided by suburbanization, or the Paramount case, Cineplex Odeon was set to offer a better moviegoing experience.

The changes began with sound. In the 1970s, into a nearly completely optical sound-track world of theatrical exhibition, came Sensurround. For Earthquake (Mark Robson, 1974) Universal added an eight-minute sequence in which Los Angeles is destroyed, with imagery that was enhanced by adding low-frequency sound waves to the sound track during the dubbing process to produce a rumbling effect. Theater owners were offered (for five hundred dollars per week) special speakers and an amplifier to produce the effect in their theaters. Sixty units were built, and the gimmick helped the film become one of the top box-office attractions of that year. The Sensurround system won a special Academy Award and was employed again for such films as Midway (Jack Smight, 1976), Rollercoaster (James Goldstone, 1977), and Battlestar Galactica (Richard A. Colla, 1979).

In 1977, Universal president Sidney J. Sheinberg declared: "Sensurround is as big a star as there is in the movie business today." He was wrong. The problem was that Sensurround worked best in a stand-alone theater. In multiplex situations the rumble poured through the walls into the other auditoria. Moreover the latter three films were not major box-office attractions. Thus the added expense and trouble seemed wasted, and the system was abandoned—although as of 1989 Universal still held the patents.25 Despite Sensurround's failure, however, the industry had recognized that sound was one way theater owners could compete with television, since most TV sets at that time had only tinny, four-inch speakers. Eventually, the quest for new sound quality coalesced around Dolby sound, a high-fidelity stereo sound that provided clear, lifelike reproduction of the entire musical range, and accurate reproduction of the volume range.

Dolby, which was first introduced in 1975, is a process that uses 35mm release prints containing stereo optical tracks in which the sound is recorded as variable patterns of light and shade on the film strip itself. The optical system was maintained because Dolby prints cost no more to produce, they last the needed time under the wear and tear of constant use, and the theater equipment requires little in the way of special upkeep. Dolby noise reduction is a means of electronically reducing the background noise inherent in all recording media while not disturbing the sounds one is intended to hear, and it is at the heart of the Dolby stereo process because it paved the way for improvements in the full range of sound.26

Dolby Laboratories was founded in 1965 by physicist Ray M. Dolby to develop noise reduction techniques for the music industry. Through the late 1960s, Dolby's innovations made their way into improved home tape recorders, cassette decks, and FM receivers. Indeed Dolby helped bring about high-fidelity cassettes that had been hampered by problems inherent in slow recording speeds. In the 1970s, Dolby turned to the film industry. Although Stanley Kubrick did all the pre-mixes and masters for A Clockwork Orange (1971) with Dolby noise reduction, the film was released with conventional optical mono sound. With Callan (Don Sharp, 1974) came the first Dolby-encoded mono sound track for general release. The first true Dolby stereo came with the release of the rock opera Tommy (Ken Russell) in 1975. The sound track in that multi-sensuous experience impressed the young audience, but Ken Russell was hardly the household name to impress moguls in Hollywood. Indeed, in the first years, the films with Dolby seemed relegated to musicals: Ken Russell's Lisztomania (1975), Robert Altman's Nashville (1975), John Badham's Saturday Night Fever (1977). But it was with George Lucas's megahit Star Wars (1977) and Steven Spielberg's Close Encounters of the Third Kind (1977) that filmmakers took full advantage of the new recording techniques, and those films scored at the box office in part because of the improved sound. In a survey reported in July 1977, the month after Star Wars opened, 90 percent of those surveyed claimed that Dolby sound made a difference. By 1979 there were 1,200 Dolby-equipped theaters in the United States.27

The cost to convert a theater in the late 1970s, depending on how sophisticated the owner wanted the house to be, was under ten thousand dollars. By late in 1984, Dolby could claim some 6,000 installations in forty-five countries around the world, with the bulk in the United States. About one quarter of theaters in the United States in the mid-1980s had this special advantage, principally all first-run, suburban theaters, most often in the center and biggest auditorium of a six-plex or eight-plex. In the mid-1980s nearly 90 percent of all Hollywood films were being released in Dolby, with the common four channels of left front, center, right front, and surround.28

TV as the Subsequent-Run Cinema

The 1970s were also the age when Hollywood feature films regularly ended their exhibition life on television, and so Americans saw more and more movies on TV. No film established this more as a basic principle than The Wizard of Oz (Victor Fleming, 1939), which had become a viewing staple by the 1970s. A 1950s deal between MGM and CBS made this tale of Kansas and Oz a classic. After NBC premiered Saturday Night at the Movies in September 1961 with How to Marry a Millionaire (Jean Negulesco, 1953), the neighborhood movie house was essentially dead, and the subsequent viewing of Hollywood feature films would be ever after on some form of television. By the mid-1950s it had become clear that Hollywood would not directly own and operate television stations (or networks), but would, rather, supply programming. With the coming of The Late Show in the mid-1950s and Saturday Night at the Movies, feature film showings became one of television's dominant programming forms.

Actually the movie show on television began in a minor way. In the late 1940s British film companies that had never been able to significantly break into the American theatrical market (in particular the Ealing, Rank, and Korda studios) willingly rented films to television. Monogram and Republic, long able to take only what the major Hollywood corporate powers left them, also jumped on board the television bandwagon with a vengeance. This multitude of small producers, all eager for the extra money, delivered some 4,000 titles to television before the end of 1950. Typical fare included B-class Westerns (Gene Autry and Roy Rogers from Republic, e.g.), and thrill-a-minute serials (Flash Gordon, also from Republic). But the repeated showings of this fare only served to remind longtime movie fans of the extraordinary number of treasures still resting comfortably in the vaults of MGM, Paramount, 20th Century-Fox, and Warner Bros.

To understand how and why the dominant Hollywood studios finally agreed to rent (or sell) their vast libraries of film titles to television, one must go back to May 1948, when eccentric millionaire Howard Hughes purchased control of the ailing RKO. In five years Hughes ran RKO into the ground. Debts soared past $20 million, and few new productions were approved to generate needed new revenues. By late 1953, it was clear that Hughes had to do something, and few industry observers were surprised in 1954 when he agreed to sell RKO to the General Tire and Rubber Company for $25 million. General Tire wanted the RKO back titles to present on its independent New York television station, WOR (today WWOR), along with other films it had acquired. In 1955, WOR regularly programmed a Million Dollar Movie, rerunning the same title throughout the week. Any number of later movie makers cited Million Dollar Movie's repetitive screenings as inspiration for commencing their moviemaking careers. George Romero, famous for making Night of the Living Dead (1968), for example, told an interviewer that seeing a screening of The Tales of Hoffmann (Michael Powell, 1951) on Million Dollar Movie made him take up a career in film.

Profit figures from movie rentals to television impressed even the most recalcitrant Hollywood movie mogul. Within the space of the following twenty-four months all the remaining major companies released their pre-1948 titles to television. (Pre-1948 titles were free from the requirement of paying residuals to performer and craft unions; post-1948 titles were not.) For the first time in the sixty-year history of film a national audience was able to watch, at their leisure, a broad cross section of the best and worst of Hollywood talkies. (Silent films were only occasionally presented, usually in the form of compilations of the comedies of Charlie Chaplin and Buster Keaton.)

From the sale or lease to television of these libraries of films, Hollywood was able to tap a significant source of pure profit. This infusion of cash came precisely at a time when Hollywood needed money to support the innovation of widescreen spectacles, and television deals followed one after the other. Columbia Pictures, which had early on entered television production, quickly aped RKO's financial bonanza. In January 1956 Columbia announced that Screen Gems would rent packages of feature films to television stations. One hundred and four films constituted the initial package, from which Columbia realized an instant profit of $5 million.

From the middle 1950s on, the pre-1948 largely black-and-white films functioned as the mainstay of innumerable "Early Shows," "Late Shows," and "Late, Late Shows." Regular screenings of re-issues had been rare in theaters in the early 1950s, but a decade later more than one hundred different films aired each week on New York City television stations, with smaller numbers in less populous cities. In particular, the owned and operated stations of CBS invested in choice titles (including many from MGM). Each film was rotated, rested for three to six months, and then repeated again. Before 1961, the three television networks only booked feature films as occasional specials, as was the case for the CBS ratings hit The Wizard of Oz, not as regular programming.

But with color television a coming reality, the three television networks wanted to show post-1948 Hollywood features, principally those in color, in lucrative, attractive prime-time slots. This required agreements from the Hollywood craft unions, including the American Federation of Musicians, the Screen Actors Guild, the Screen Directors Guild, and the Writers Guild of America. In a precedent-setting action, the Screen Actors Guild, led by its then president, Ronald Reagan, went on strike and won guaranteed residuals for televised airings of post-1948 films. This set the stage for movie showings to become staples of prime-time television.29

The NBC network premiered the first prime-time series of recent films with Saturday Night at the Movies, in September 1961, showing How to Marry a Millionaire (1953), starring Marilyn Monroe, Betty Grable, and Lauren Bacall. Ratings were high, and of the thirty-one titles shown during this initial season, fifteen were in color, and all were big-budget, post-1948 releases from 20th Century-Fox. All had their television "premiere" on Saturday Night at the Movies. NBC especially liked the color titles. RCA, pioneer in television color, owned NBC and used the network to spur sales of color television sets. CBS and ABC, seeing how their shows (CBS's Have Gun, Will Travel and Gunsmoke, ABC's Lawrence Welk) fared against Saturday Night at the Movies, quickly moved to negotiate their own "Nights at the Movies." ABC, generally a distant third in the ratings during the 1960s, moved first, with a mid-season replacement, Sunday Night at the Movies, commencing in April 1962. CBS, the long-time ratings leader in network television, remained aloof and did not set in place its own "Night at the Movies" until September 1965.30

But with CBS joining the fray at the beginning of the 1965-1966 television season, the race was on. Television screenings of recent Hollywood movies became standard practice. In 1968 nearly 40 percent of all television sets in use at the time tuned in to Alfred Hitchcock's The Birds (the theatrical release date was 1963). Bridge on the River Kwai (David Lean, 1957), which was shown in 1966, and Cat on a Hot Tin Roof (Richard Brooks, 1958), shown in 1967, achieved ratings nearly as high. Clearly, recent feature films could be shown on television to blockbuster ratings: when Gone With the Wind (Victor Fleming, 1939) was shown in two parts in early November of 1976, half the nation's television sets were tuned in to that one particular offering. By the fall of 1968, ABC, NBC, and CBS "Nights at the Movies" covered every night of the week. By the early 1970s, overlapping permitted ten separate "movie nights" each week. Throughout this period NBC remained the most committed network, in part because it wanted to use recent Hollywood features to help stimulate demand for RCA's color television sets.31

The success of the movie showings on the networks significantly affected affiliated stations. The number of "Late" and "Early" shows fell by 25 percent. Independent stations not affiliated with one of the three television networks continued to rely on pre-1948 features. Indeed, films on independent channels accounted for one-quarter of their schedules. With rediscovered hits like Warner Bros.' Casablanca (Michael Curtiz, 1943) and RKO's King Kong (Merian C. Cooper, 1933) spaced judiciously throughout the viewing year, regular screenings of movies on television drew large audiences, but routine B-class thrillers and wacky low-budget war musicals spent their drawing power after one or two screenings. This unprecedented wave of movie programming quickly depleted the stock of attractive features that had not played on television. On the network level, the rule was to run a post-1948 feature twice ("premiere" and "rerun"), and then release it into syndication so that it then could be used by local stations for their "Late" or "Early" shows. Network executives searched for ways to maximize the audiences for repeated showings of blockbuster films.32 It soon became clear to all who paid close attention that there were "too many" scheduled movie showings on television, and that there was "too little" new product coming into the pipeline to fill future slots with new theatrical films. Hollywood knew this, and the studios began to charge higher and higher prices for television screenings. Million-dollar price tags became commonplace, first for films that had done well at the box office, and then for those that might not have done so well but had won some sort of an award, in particular an Academy Award. For the widely heralded September 1966 telecast of The Bridge on the River Kwai, the Ford Motor Company put up nearly $2 million as the sponsor. When the film attracted some sixty million viewers against formidable competition, Hollywood insiders speculated that $10-million price tags would appear shortly.

TV network executives found a solution: make movies aimed for a television premiere. The networks could monitor production costs and guarantee fixed future rentals in the $300,000 to $500,000 range. Moreover, the networks could use these made-for-television movies to test new shows which might then be "downsized" to appear as regular series. The networks were used to paying the complete cost of pilot programs, so it was not a huge step to fashion them into stand-alone made-for-television films. Experiments began as early as 1964 when, in October, NBC aired See How They Run (David Lowell Rich, 1964), starring John Forsythe, Senta Berger, Jane Wyatt, Franchot Tone, Leslie Nielsen, and George Kennedy. Labeled "Project 120," in honor of its length in minutes, the experiment proved a modest success. The next entry, The Hanged Man (1964), came six weeks later. The idea was to do an anthology series to help NBC overtake CBS in the ratings war.33

Early in 1966, NBC contracted with Universal studios to create a regular series of "World Premiere" made-for-television movies. NBC specified that all films had to be in color, again to reinforce its leadership in that area. The agreement with Universal dictated that once the TV movie was shown twice on the network, rights reverted to Universal, which could release it to theaters in the United States (a rare occurrence), then to foreign theaters (more common), and finally to American television stations for their "Early" and "Late" shows (also common).34 The initial entry for this continuing effort was Fame Is the Name of the Game (Stuart Rosenberg, 1966), starring minor luminaries Jill St. John and Tony Franciosa, which was presented on a Saturday night in November 1966.

Made-for-television motion pictures took only five years to become a mainstay genre of American network television programming. By early in the 1970s, movies made for television outnumbered films that had been made for theatrical release on network "Nights at the Movies." After NBC led the way, ABC, seeing a successful trend, followed close behind. CBS, again smug with years of constantly leading the ratings with traditional series, eventually joined in. Profits proved substantial. A typical movie made for television cost three-quarters of a million dollars, far less than what Hollywood was demanding for rental of its recent blockbusters. And the ratings were phenomenal. Few expected that millions upon millions would tune in for The Waltons' Thanksgiving Story (1973), The Night Stalker (John Llewellyn Moxey, 1971), A Case of Rape (Boris Sagal, 1974), and Women in Chains (Bernard Kowalski, 1972). Such fare regularly outdrew what were considered the biggest films of the era, including The Graduate (Mike Nichols, 1967; 1973 premiere on network television), West Side Story (Robert Wise, 1961; 1972 premiere on network television), and Goldfinger (Guy Hamilton, 1964; 1972 premiere on network television). With the help of the made-for-television movie, network executives moved their "Nights at the Movies" to a full quarter share of all prime-time programming.35

One film in particular signaled that the made-for-television movie had come of age. The ABC Movie of the Week had premiered in the fall of 1969, sponsored by Barry Diller, then head of prime-time programming at ABC, later CEO of Paramount (1974-1984) and Fox (1984-1992), where he founded the Fox network. During the 1971-1972 television season, ABC's Movie of the Week series, which programmed only movies made for television, finished as the fifth-highest series of the year. On November 30, 1971, ABC presented a little-publicized made-for-TV movie entitled Brian's Song (Buzz Kulik, 1971), about a football player who dies of cancer. One-third of the households in the country watched, and half the people watching television that Tuesday night selected that movie over the fare offered on CBS and NBC. In its first five years, the ABC series accounted for four of the top twenty-five made-for-television movies for the period 1965 through 1980.36

Brian's Song vaulted to tenth place in all-time movie screenings on television. With The Wizard of Oz accounting for five of the top nine ratings up to that November night, Brian's Song joined The Birds (1963), Bridge on the River Kwai (1957), and Ben-Hur (William Wyler, 1959) in that elite grouping. It demonstrated that movies made for television could win Emmys (five), the prestigious George Foster Peabody award, and citations from the NAACP and the American Cancer Society. When then President Richard M. Nixon declared Brian's Song one of his all-time favorite films, ABC reaped an unexpected publicity bonanza.

Nothing about the movie set it apart from other typical early ABC movies made for television. Producer Paul Junger Witt had worked with ABC before on his Partridge Family series, and Buzz Kulik was a seasoned director of television series fare. But Brian's Song proved that success on the small screen could be translated into theatrical features. Billy Dee Williams, who played football star Gale Sayers, had been kicking around Hollywood for years with little success. Brian's Song propelled him into major roles in Lady Sings the Blues (Sidney J. Furie, 1972) and Mahogany (Berry Gordy, 1975). James Caan, the other leading figure in Brian's Song, moved on to stardom in The Godfather (Francis Ford Coppola, 1972).37 Brian's Song cost less than a half million dollars to make because stock footage from National Football League games kept production expenses to a minimum and shooting lasted only two weeks. But the impact of the first run of Brian's Song nearly equaled the publicity bonanza associated with a successful feature film. Books about the film's hero became best-sellers. The TV movie's music moved onto Billboard's charts. The success of Brian's Song signaled that the networks should plan to react to unexpected hits, preparing publicity campaigns to exploit twists and turns in public opinion, even to shape it, as only theatrical films had done in the past.38

By the late 1970s, the made-for-television motion picture had become a staple. One study found that across the three networks during the 1979-1980 television season there were some 430 runs of movies, of which 40 percent were telecasts of theatrical fare and 60 percent were made-for-television films. The three networks booked just about the same number of theatrical features (about sixty each), but CBS and NBC scheduled 50 percent more made-for-television films than rival ABC.39 Furthermore, made-for-television movies made it possible to deal with topical or controversial material not deemed appropriate for regularly scheduled network series, which were destined for syndication and, as "evergreens," could not afford to seem dated. And serious and celebrated actors and actresses who did not wish to work in series television could be featured in TV movies.

Another major change in movie-type programming came in the mid-1970s with the rise of the miniseries. Running over several nights, Roots (David Greene, et al., 1977), still the most celebrated example of the form, attracted nearly two-thirds of Americans when it was shown. Indeed, in a single week in January 1977, an estimated 130 million viewers tuned in to at least one episode of Roots during its eight consecutive nights. Eighty million watched the final episode of this docudrama, breaking the audience record held by Gone with the Wind (1939). A variety of follow-up studies suggested that viewing among black audiences was even higher than among whites. Roots was a controversial show that provoked discussion, even rare interracial discussion, as some studies found. Television had discovered an event of its own, one that was the equal of a blockbuster theatrical film.40

The origins of the miniseries went back to 1971, when NBC invested more than $2 million in Vanished (Buzz Kulik), a four-hour made-for-television movie planned for broadcast over two nights in March 1971. The ratings breakthrough came with NBC's The Blue Knight (Robert Butler), broadcast in four one-hour segments between November 13 and 16, 1973. Starring William Holden, this tale of an aging policeman earned high ratings and critical acclaim, proving that a miniseries could make a difference in the crucial ratings sweeps months. A miniseries typically commenced on Sunday, the night of highest viewing; if it ran more than three nights, it invariably skipped the evenings occupied by the network's top regular shows.

Television Redefines U.S. Movie Watching

The coming of the movies to television has meant more than simply telecasting the best and worst of (sound) motion pictures from Hollywood's past and repeated presentations of TV movies. Television has changed the way Americans consume films. For example, in the 1970s a duo of reviewers from Chicago proved that TV would thereafter be the site of a new, distinctly televisual style of reviewing. Contrasting like Laurel and Hardy, Gene Siskel and Roger Ebert pioneered short, pithy weekly critiques of the latest cinematic fare, reaching more potential movie viewers than any of the writers who plied their trade at a newspaper or magazine.

In terms of the sheer number of screenings, the movie fan of the latter third of the twentieth century had never had it so good. Indeed, movie showings on television in the 1970s seemed an endless stream, but increased viewing hardly represented the only transformation that television imposed on the movies. Reliance on television for the presentation of motion pictures extracted a high price in viewing conditions. The television image is proportioned four by three, while the standard motion picture image is much wider. To accommodate the changed proportions, the widescreen film is cut off at the sides to fit onto the smaller video screen. Panning and scanning re-edits the widescreen film so that the "action" stays at the center of the frame, and the changes can be profound. For example, John Boorman's Point Blank (1967) employs cramped compositions with characters on the screen one moment and off the next. On television, Point Blank becomes a jumbled, confusing mess because the widescreen compositions are fractured to place the action at the center of the screen.

Of course, films do not need to be panned and scanned. One can instead reduce the image for television until all of it fits on screen; in practice this technique of "letterboxing" fills the empty space above and below the picture with a black matte. In the 1980s a great deal of lip service was paid to letterboxing, but movie watchers en masse did not seem to care for it. Indeed, back in the 1960s, as Hollywood studios tested their film-to-video transfers, letterbox prints were made to check the overall quality of a widescreen movie's negative or master positive before it was transferred to video. These were element tests, and the home viewer never saw them. Instead the studio filled the television frame with the center of the film and then panned and scanned to capture "all" the action. Today technicians can hide the panning and scanning, making them look "natural" in the course of the film, but these "additions" were never part of the original viewing experience.41
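A rough calculation illustrates the trade-off. Assuming a 2.35:1 anamorphic frame, a widescreen ratio common in the period, shown on a 1.33:1 (four-by-three) television screen:

\[
\frac{1.33}{2.35} \approx 0.57
\]

A pan-and-scan transfer can therefore show only about 57 percent of the original frame's width at any given moment, while a letterboxed transfer preserves the full width but leaves roughly 43 percent of the television screen as black matte. Films shot in narrower widescreen ratios such as 1.85:1 lose less, but the compromise is the same in kind.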

The biggest complaint from the average television viewer of motion pictures has long been the interruption of the movie by advertisements. To fit the formula of commercial television, a station allocated roughly 25 percent of its prime-time slots to commercials. This meant that in a ninety-minute slot the movie had to be trimmed to seventy-eight minutes; in a two-hour slot the film could not run more than ninety minutes. Cutting to fit the allotted time has been commonplace since the early 1950s, and stories of how television stations accomplished this heinous task are legendary. It is said that Fred Silverman, when he was a lowly film editor at WGN in Chicago, fit the ninety-six-minute Jailhouse Rock (Richard Thorpe, 1957) into a seventy-eight-minute afternoon movie block by cutting all of Elvis's musical numbers!42
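The arithmetic is straightforward. Taking the quarter-of-the-slot figure cited above for prime time:

\[
120 \text{ minutes} \times (1 - 0.25) = 90 \text{ minutes}
\]

so a two-hour slot leaves about ninety minutes for the film itself. By the same logic, the ninety-six-minute Jailhouse Rock could fill a seventy-eight-minute afternoon block only by shedding eighteen minutes of footage, musical numbers or not.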

Since over-the-air television could not rid itself of advertisements and fixed time slots, a market arose for uninterrupted screenings of features on television. Enter cable television. Over-the-air television served as the principal second-run showcase for Hollywood films into the 1970s, but the number of TV stations in any one market was limited. Most of the nation had but three channels, which served up only network fare. The bigger cities did have independent stations that could counter-program, often with movies, but nowhere could movie fans see their favorites as they had seen them in theaters. The emergence of cable television, principally through pay channels, took advantage of the frustrations of millions of Americans who watched most of their movies on television. But it took Time, Inc.'s subsidiary, Home Box Office (HBO), to innovate a profitable strategy during the 1970s. In retrospect HBO's success is not surprising. In one survey taken in the days before cable television became widespread, respondents were asked what they most disliked about film showings on the TV networks. There were only two significant answers: constant advertisement interruptions and the long wait for blockbusters to appear. HBO solved both problems, and more.43

HBO began as a microwave-delivered service in 1972, but it was not until 1975, when HBO went to satellite distribution, that it sparked widespread interest in cable television. In one of the most productive investments in television history, even before the satellite had been launched, Time gambled $7.5 million on a five-year lease to put HBO on RCA's satellite, Satcom I. HBO commenced national satellite distribution on September 30, 1975, and within five years moved from a base of three hundred thousand subscribers to six million. By giving its subscribers uncut, uninterrupted movies only a few months after they had disappeared from theaters, HBO grew at a rate during the late 1970s and into the early 1980s that proved nothing less than spectacular. By 1983 the company could claim twelve million subscribers. Indeed, Time, Inc. proved so successful with its cable operations (read: HBO) that the video arm of the company surpassed its publishing activities in profits generated.44

In 1976, Viacom International, a major television program supplier and owner of cable systems, created a rival to HBO with Showtime, which went to satellite distribution in 1979. In 1979, Warner Cable joined with its then-partner American Express to create The Movie Channel. In 1980, Time created Cinemax as an all-movie complement to HBO, aimed at younger audiences, in particular yuppies. To further differentiate its product, during the 1980s Cinemax regularly scheduled more films than the competition, an average of eighty-five per month. The Movie Channel followed with an average of seventy-eight per month; Showtime had fifty-five, and HBO only fifty. But this was understandable, since HBO and Showtime had contracted for more of the blockbuster, star-laden titles, which the two repeated regularly.45

The Hollywood studios tried to establish a pay-cable movie channel of their own, Premiere. The majors argued that HBO had prospered through the 1970s on the strength of Hollywood movies, yet the studios received insufficient income from showings on HBO and Showtime because those two, and those two alone, controlled the marketplace. To gain the upper hand, in April 1980 Columbia Pictures, MCA/Universal, Paramount, and 20th Century-Fox announced a joint deal with the Getty Oil Company to create Premiere as their own pay-television movie channel. They planned to withhold their films; only after a title had played on Premiere would it be released for screening on HBO.46 In response, HBO and Showtime asserted that this constituted a violation of antitrust law, and the United States Department of Justice, near the end of the administration of President Jimmy Carter, agreed and filed suit. There was screaming and shouting, but in the end the four Hollywood companies backed off and Premiere never went into business. Ironically, had they tried a few years later, the administration of Ronald Reagan probably would have given the go-ahead.47

In the format used by HBO and Showtime, each movie is scheduled four to six times over the course of a month, on different days and at different hours in the daily schedule. The idea is to give the viewer several opportunities to watch each film without quickly exhausting the movies the pay service has to offer. Thus, the success of a pay-cable movie channel has been determined not by the ratings for a single program but by the general appeal and satisfaction level of the month as a whole, tested not by a rating system but by whether customers kept on writing their monthly checks.48

With all the changes and programming variations in the pay-television marketplace, the big winners were the Hollywood studios. Precise figures varied from deal to deal, but reliable estimates suggested that in the late 1970s the major studios gained from $5 million to $7 million per film from a deal with HBO/Cinemax or Showtime/The Movie Channel. Over a single year, this meant each of the six major studios stood to gain on average an extra $100 million. But movie viewers gained as well. For twenty years over-the-air television had brought the best and worst of Hollywood into the home, but with significant disadvantages: constant interruptions for advertisements, sanitization of movie stories to please television's moralist critics, and waits of several years for hits to appear. Pay-cable movie channels jettisoned the advertisements and ran the films intact only months after they had disappeared from movie theaters. (For theatrical failures, the "clearance time" could be a matter of weeks.)49

HBO and its pay-cable competitors were not the only significant additions to the second-run cinema at home. Cable TV's superstations gave rural and small-town cable viewers access to nationally distributed independent stations with dozens of movies shown each week. Led by Ted Turner's WTBS, channel 17 from Atlanta, superstations offered movies for approximately half of their broadcast day. WTBS became so successful that by the mid-1980s Turner purchased MGM just to gain access to its considerable movie library, feeding the superstation's forty movie screenings per week.

Remarkably, despite all the new venues for home movie viewing, revenues at movie houses increased steadily throughout the 1970s. Indeed, as the decade ended, Hollywood earned a record gross from the domestic box office. The baby boomers were not only attending blockbusters in their full theatrical glory but also watching more and more films at home. In totality, then, movie presentation had never been more active or healthy. From this base a revived theatrical experience, led by Cineplex Odeon, would commence in the 1980s, and with the innovation of home video even more film viewing would take place at home. Surely, in historical perspective, the 1970s were a staging ground for a renewal of movie exhibition that would make the "Golden Age" of Hollywood pale in comparison.