Chapter 5: Electronics, The Internet, and Entertainment Media
For many Americans, new technologies simply mean new toys. Almost every advancement in consumer technology since the 1980s has in some way been tied to entertainment. In Statistical Abstract of the United States: 2008 (2007, http://www.census.gov/prod/2007pubs/08abstract/infocomm.pdf), the U.S. Census Bureau discusses the amount of money Americans spent between 2000 and 2005 for media content, which included pay-television subscriptions, video games, home video, and music. The average American spent $787.44 on all media in 2005, which was $177.09 more than in 2000, a 29% increase. This price tag does not seem unreasonable considering how much time Americans devoted to listening to music or immersing themselves in the virtual worlds of television and video games. In 2005 the average American aged twelve and older watched 1,659 hours of television, up 10% from 2000 (1,502 hours). During this time, however, broadcast television saw a 16% decline in viewership, from 812 hours in 2000 to 679 hours in 2005; at the same time, cable and satellite television services gained viewership, from 690 hours in 2000 to 980 hours in 2005, a 42% increase. This trend was projected to continue through 2010. Americans aged twelve and older listened to an average of 805 hours of radio programming and 189 hours of recorded music in 2005; they also spent an average of 73 hours playing video games.
Even though television was still the media outlet of choice in 2008, Americans have been rapidly turning to new forms of entertainment made available by the Internet and other technologies. According to the Pew Internet & American Life Project (Pew/Internet), in “Internet Activities” (February 15, 2008, http://www.pewinternet.org/trends/Internet_Activities_2.15.08.htm), as of 2007, 62% of American adults who used the Internet had surfed Web sites for fun; 56% had watched a video clip or listened to an audio clip online; 48% had watched a video on a video-sharing site such as YouTube; 42% had downloaded files containing games, videos, or pictures; and 37% had downloaded music files.
Once considered the pastime of children and socially challenged adults, video and computer games now represent a major form of entertainment in the United States. Video games, also known as console and arcade games, are played using a computer that is specifically designed to play games. By contrast, computer games are just one type of program that can be run on standard personal computers. The difference between the two types of games is in how they are accessed, not necessarily in their content. Many games can be played using either a video game system or a computer. Thus, the terms video game and computer game are sometimes used interchangeably. According to the Census Bureau, in Statistical Abstract of the United States: 2008, the average amount of time Americans aged twelve and older spent playing video (and computer) games rose from sixty-five hours in 2000 to seventy-three hours in 2005, an increase of 12%. The average person in 2005 spent $32.23 on these games, compared to $28.01 in 2000—an increase of 15%.
The largest retail launch in the history of the entertainment industry as of 2008 did not center on a movie, album release, or television series. It took place on April 29, 2008, and belonged to the video game Grand Theft Auto IV (GTA4), developed by Rockstar North for the Microsoft Xbox 360 and Sony PlayStation 3 consoles. Before GTA4, the biggest retail launch in history had been the September 25, 2007, launch of another video game, Halo 3, developed by Bungie Studios for the Microsoft Xbox 360 console. In the press release “Take-Two Interactive Announces Rockstar Games' Grand Theft Auto IV Breaks Entertainment Launch Records” (May 7, 2008, http://ir.take2games.com/ReleaseDetail.cfm?ReleaseID=308689), Take-Two Interactive Software states that on its first day of release, GTA4 sold 3.6 million copies, reaching $310 million in sales; within one week more than 6 million copies were sold globally, with a retail value of more than $500 million. For comparison purposes, the article “‘Dark Knight’ Sets One-Day Box Office Record, Passing ‘Spider-Man 3’” (Associated Press, July 19, 2008) reports that the leading one-day box office sales record, set by the movie The Dark Knight on July 18, 2008, was $66.4 million.
Rise of Video and Computer Games
Computer and video games are almost as old as computers. Many credit Alexander Shafto Douglas (1921–) with creating the first graphics-based computer game at Cambridge University in England in 1952. Part of his doctoral research on human-computer interaction, the game was played on an enormous Electronic Delay Storage Automatic Calculator (EDSAC) computer, which was one of the first computers in existence and was made primarily from rows and rows of vacuum tubes. The EDSAC display screen was a thirty-five-by-sixteen array of monochromatic dots. The name of Douglas's game was OXO or Noughts and Crosses, a human-versus-machine version of tic-tac-toe in which the human player chose the first square. Over the following two decades, computer games were developed on mainframe computers and, eventually, on ARPANET, the nationwide network of military defense computers that preceded the Internet.
One of the more popular games that spread to computer mainframes across the United States during the 1960s was Spacewar!, created in 1961 by Steve Russell (1937–), Martin Graetz (1935–), and Wayne Wiitanen at the Massachusetts Institute of Technology to test the capabilities of the $120,000 Digital Equipment Corporation PDP-1 computer. The game consisted of two low-resolution ships, one shaped like a needle and the other like a wedge, flying around a dot that represented a sun in the middle of the screen. The object was to destroy the other player's ship while maneuvering through the sun's gravitational pull.
In 1971 Nutting Associates released Computer Space, the first video game for the general public. Computer Space, a direct imitator of Spacewar!, was set in a futuristic arcade-style cabinet. Most people considered the game too complicated at the time, so Nutting made only fifteen hundred units and then stopped production. The next year, however, Atari released Pong. In this monochromatic game, a small cube was bounced back and forth between two slightly larger rods controlled by the player(s). The video game was a smash hit, and Atari sold more than eight hundred thousand arcade cabinets. Earlier that year, Magnavox had released the Odyssey, which was the first home console video game system that ran on a television set. The Odyssey, which sold for $100 ($523 in 2008 dollars), had several different games installed on it, all of which involved hitting a pixelated square (or squares) on the screen with rectangles. According to the Atari Museum (February 9, 2008, http://www.atarimuseum.com/videogames/dedicated/homepong.html), the Atari home version of Pong was released in 1975 and sold over 150,000 units during the holiday season alone.
Golden Age of Video Games
Within a year after these initial offerings, video games quickly gained a foothold in the United States. A steady stream of fairly unremarkable cabinet games was released throughout the 1970s. For most of the decade, video games were novelties that sat next to pinball machines in bowling alleys, bars, and roller-skating rinks. With the arrival of Asteroids and Space Invaders in 1978, arcade video games came into their own. Space Invaders, a game in which the player shot row after row of advancing aliens, triggered a nationwide coin shortage in Japan so severe that the Japanese government had to more than double yen production. Namco introduced the first color game in 1979 with the arrival of Galaxian, and then in 1980 the company released Pac-Man. The original name of the game was Puckman, derived from the Japanese pakupaku, which means “flapping open and closed” (e.g., the character's mouth). Despite the game's simple concept of guiding a yellow, dot-eating ball around a maze, over a hundred thousand arcade units were sold in the United States. The game inspired an entire line of merchandise from lunch boxes to stuffed toys. Between 1980 and 1983 many colorful, engaging video games were released, including Defender, Donkey Kong, Centipede, Frogger, and Ms. Pac-Man, which still holds the record for the most arcade games sold, at 115,000 units, according to William Hunter, in “Player 2 Stage 4: Two Superstars” (2006, http://www.emuunlim.com/doteaters/play2sta4.htm). Video arcades sprang up in every mall and town in the United States. On January 18, 1982, the cover of Time magazine read: GRONK! FLASH! ZAP! VIDEO GAMES ARE BLITZING THE WORLD. The cover story, “Games That Play People” by John Skow, revealed that in 1981 nearly $5 billion in quarters was spent playing arcade games. By comparison, the U.S. film industry took in $2.8 billion that year.
At the same time, game consoles were gaining popularity in living rooms across the United States. In 1977 Atari launched the Atari VCS (later named the Atari 2600) for $250 ($902 in 2008 dollars). By Christmas 1979 sales were brisk as people realized that the system could support more than just Pong. With the release of Space Invaders on the system the following year, units flew off the shelves at $150 apiece ($503 in 2008 dollars). Tekla E. Perry and Paul Wallich explain in “Design Case History: The Atari Video Computer System” (IEEE Spectrum, March 1983) that Atari sold over twelve million consoles between 1977 and 1983. More than two hundred games were made for the system. Other video systems such as Intellivision and Colecovision gained huge followings as well. Skow noted that six hundred thousand Intellivision units were sold in 1981. Overall, 1981 sales for home video games exceeded $1 billion.
Video Game Industry Stumbles
By 1984 the Commodore 64 home computer had debuted at $1,000 ($2,105 in 2008 dollars), and the Apple IIc was introduced at the comparatively affordable price of $1,300 ($2,737 in 2008 dollars). Such computers not only offered better graphics than the contemporary video game consoles but were also useful for practical applications such as spreadsheets and word processors. Consequently, people began to lose interest in video games and buy home computers instead. In 1983, faced with a collapsing video game market, losses of hundreds of millions of dollars, and far too much inventory, Atari loaded fourteen tractor-trailer trucks with thousands of unsold cartridges and pieces of hardware. It drove the surplus out to a landfill site in Alamogordo, New Mexico, and buried the inventory in a concrete bunker under the desert. The following year Warner Communications, the owner of Atari, sold the game and computer divisions of Atari to Jack Tramiel (1928–), the founder of Commodore. Mattel, the maker of Intellivision, also shed its electronics division, and hundreds of arcades closed as well.
For several years gaming was relegated to the computer. Before 1983 computer games were low on graphics and heavy on text, but by 1984 a number of colorful and entertaining games became available for home computers, including the Ultima and King's Quest series. However, toy and electronics manufacturers in the United States were wary of investing in video game consoles after the Atari disaster.
Japanese companies were not nearly as pessimistic and continued to invest money into video console development. Nintendo, a company that originally manufactured Japanese playing cards, surprised the entire gaming market in 1986 when it released the Nintendo Entertainment System (NES). The games looked better than most arcade games from the early 1980s and took as long to play through as computer games. After two years on the market, the NES found its way into almost as many homes as the Atari 2600. According to Nintendo Land (2004, http://www.nintendoland.com/home2.htm?history/hist3.htm), the sales of NES video games in 1988 once again reached $1.1 billion. Arcades at the time also enjoyed a brief revival with the advent of complex fighting games such as Mortal Kombat and Street Fighter II. Mortal Kombat, which eventually made it onto home consoles, inspired a congressional investigation into violence in video games and led to the establishment in 1994 of the Entertainment Software Rating Board, an industry self-regulatory organization that monitors the content of video games for depictions of violence, nudity, profanity, and other material that parents might find objectionable for young children.
Present and Future of Gaming
Since the late 1980s the U.S. electronic gaming market has continued its rise, with the majority of the gaming industry's revenues coming from console systems and games. After the NES ran its course with estimated sales of sixty million units worldwide, the Sega Genesis video game system enjoyed a period of popularity. The electronics giant Sony entered the fray in 1995 when it released PlayStation in the United States. Nintendo answered Sony's challenge with Nintendo 64 in 1996, which sold 1.7 million units in the United States in the first three months, according to Michael Miller, in “A History of Home Video Game Consoles” (April 1, 2005, http://www.samspublishing.com/articles/article.asp?p=378141&seqNum=6&rl=1). In 2000 Sony launched the PlayStation 2. Taking note of the profits brought in by successful gaming systems, Microsoft, the largest software firm in the United States, launched the Xbox in 2001. Both Sony and Microsoft funded enormous advertising campaigns to promote their systems, and Microsoft sold its system at a loss to introduce it into more homes. Peter Lewis states in “Should You Wait for the PS3?” (Fortune, November 22, 2005) that over ninety-six million PlayStation 2 consoles and twenty-five million Xbox systems had sold worldwide by the end of 2005. With the release of the Xbox 360 in 2005 and the PlayStation 3 and Nintendo Wii in 2006, the video game market enjoyed double-digit year-over-year growth. The marketing research firm NPD Group reports in the press release “2007 U.S. Video Game and PC Game Sales Exceed $18.8 Billion Marking Third Consecutive Year of Record-Breaking Sales” (January 31, 2008, http://www.npd.com/press/releases/press_080131b.html) that the video and computer gaming industry sold $18.8 billion in merchandise in 2007, up 40% from $13.5 billion in 2006.
The huge year-over-year growth was led by console hardware, which, with products such as the Nintendo Wii retailing for $249, reached sales of $5.1 billion in 2007, a 73% increase over 2006 ($2.9 billion). Software for computer games accounted for $911 million in 2007.
In contrast, the computer game market has grown at a slower pace than the video game market. In the 1990s with the advent of Microsoft's Windows operating system, the computer game market split in two. Solitaire and countless other card and puzzle games found their way onto the desktops of every personal computer and provided a brief escape from work or schoolwork. At the same time, computers also became the platform for cutting-edge strategy and shooting games, which are generally played by a relatively small, devoted computer gaming audience. Graphics-intensive games such as Quake and Half Life 2 led to increased sales in computer components as gamers bought extra memory and bigger hard drives to boost computer performance to handle advanced graphic engines.
In the mid-1990s computer games began to go online. Hardcore fans of card games and battle and quest adventures found competitors on the Internet. The next generation of console systems also enabled gamers to go online and play against one another. In “The Future of Online Gaming” (PC Magazine, March 27, 2003), Cade Metz notes that by 2002 online gaming traffic made up nearly 9% of the overall traffic along the Internet backbone in the United States. The fastest growing segment of online gaming appeared to be in the console game market. Xbox Live, an online service for the Xbox, gained 350,000 subscriptions at the beginning of 2003. By 2008 Microsoft announced in the press release “Xbox 360 First to Reach Ten Million Console Sales in U.S.” (May 14, 2008, http://www.xbox.com/en-US/community/news/2008/0514-10million.htm) that with ten million Xbox 360 consoles sold in the United States and nineteen million sold globally, membership in the Xbox Live community topped twelve million.
Gaming Violence and Addiction
Over the years games have grown exceedingly more complex and engaging. The Sims series by Electronic Arts Inc. has provided gamers with a “real-life” fantasy world where they can simulate alternate lives. Game series such as Doom and Grand Theft Auto allow people to take out their aggressions on virtual demons or rival gang members. Massively multiplayer online role-playing games (in which a large number of players interact with each other in a virtual world that continues even when a player is offline), such as World of Warcraft, open up entire fantasy worlds where players are free to roam and embark on quests with other gamers.
As the complexity of games has grown, so, too, has the temptation for many to play video games in excess to escape their problems. Though still a largely unstudied phenomenon, gaming addiction appears to be increasingly commonplace. In “Video Game Addiction: Is It Real?” (April 2, 2007, http://www.harrisinteractive.com/news/allnewsbydate.asp?NewsID=1196), Harris Interactive finds that about 23% of gamers aged eight to eighteen reported in a January 2007 poll that they had felt addicted to video games at some time, and 8.5% suffered enough negative consequences from excessive play to be classified as addicted gamers. Those classified as addicted in the Harris poll played an average of 24.5 hours per week, compared to 13 to 14 hours per week on average for all youth aged eight to eighteen. About two-thirds (65%) of addicted gamers had video game systems in their bedrooms. Harris Interactive notes that as a group they were more likely to have been diagnosed as having an attention deficit problem and were more likely to be receiving lower grades in school than those who played games less.
Many psychologists believe games provide a means of escape for people with stressful lives or mental problems in much the same way as drugs and alcohol. A number of symptoms that accompany gaming addiction are similar to those of other impulse control disorders, including alcoholism and drug abuse. These include preoccupation with gaming life over real-life events, failed attempts to stem gaming behavior, having a sense of well-being while playing games, craving more game time as well as feeling irritable when not playing, neglecting family and friends, lying about the amount of time spent gaming, and denying the adverse effects of too much gaming. In June 2007 the Council on Science and Public Health urged the American Medical Association to classify excessive gaming as an addiction. Lindsey Tanner reports in “Is Video-Game Addiction a Mental Disorder?” (Associated Press, June 22, 2007) that advocates of this classification hope the move will bring attention to the plight of those whose lives are affected by excessive gaming and will enable affected families to gain insurance coverage for treatment.
In “Evidence for Striatal Dopamine Release during a Video Game” (Nature, vol. 393, no. 6682, May 21, 1998), M. J. Koepp et al. find that video games can cause secretions of the neurotransmitter dopamine in the brain. Neurotransmitters are chemicals that relay signals between brain cells. Dopamine is one of dozens of these chemicals and has been known to induce pleasure in the brain to reinforce behavior. Previous studies show that dopamine levels in hungry rats increased when they received food. In this case the chemical reinforced the pleasure the hungry rat felt on eating, so the next time the rat was hungry, it would be doubly motivated to eat. Dopamine levels have also been associated with pleasure-producing drugs. Koepp et al. report injecting human volunteers with a chemical that reacted to the brain's secretion of dopamine. The scientists then used positron emission tomography (PET; a brain-imaging technique) to monitor the levels of dopamine in the brain as the volunteers steered a computer game tank through enemy territory picking up flags. The participants were awarded $10 for each level completed. The PET showed that dopamine levels shot up in the volunteers each time they finished blasting through a level. Even though Koepp et al. state that they were simply testing the technique of tracking dopamine and not gaming addiction, the possibility exists that goal-oriented games elicit a pleasurable chemical response in the brain. Memory of such pleasure could cause addicted gamers to come back for more.
Another problem people have with video games is violence. Parents and teachers have always been concerned that violent games may lead to violent, aggressive behavior. Fears were fueled in 1999 by the shootings at Columbine High School in Littleton, Colorado, where two teenage students killed thirteen people. In their suicide note, the teenage murderers said they drew inspiration from the video game Doom. In Grand Theft of Innocence? Teens and Video Games (September 16, 2003, http://www.gallup.com/poll/9253/Grand-Theft-Innocence-Teens-Video-Games.aspx), Steve Crabtree of the Gallup Organization indicates that many parents and educators are concerned about the violence in video games such as Grand Theft Auto. Crabtree states that the concern is perhaps justified. Nearly three-quarters (74%) of teens played video games at least one hour per week in 2003, and 60% of teens had at some point played a game in the Grand Theft Auto series. Only sports games were played more. (See Figure 5.1.) Not only do such violent games give teens a false impression of adult life, but studies also show that the games may hinder social development in some teens. According to Crabtree, a 2001 study at Tokyo University revealed that violent games stunt the development of the brain's frontal lobe, which is the part of the brain that controls antisocial behavior.
There is no doubt inside or outside the scientific community that gambling can be addictive. One troubling development on the Internet in the early 2000s was the continued rise in online gambling. According to a congressional statement by the U.S. deputy assistant attorney general John G. Malcolm (April 29, 2003, http://www.usdoj.gov/criminal/cybercrime/Malcolmtestimony42903.htm), seven hundred Internet gambling sites existed in 1999. By 2003 the U.S. Department of Justice estimated that eighteen hundred gambling sites were in place, bringing in roughly $4.2 billion. In “New Legislation May Pull the Plug on Online Gambling” (USA Today, October 3, 2006), Michael McCarthy and Jon Swartz report that in 2006 online gambling was estimated to be a $12 billion industry worldwide, with as many as twenty-three hundred gambling sites in operation.
Many of these sites allow gamblers to wire money from their checking accounts into a gambling account run by the casino. When the player wishes to gamble, he or she simply goes online and begins a session. Because most of these big online gambling operations are based in foreign nations in the Caribbean or South America, the U.S. government cannot regulate them. Malcolm points to instances where the online houses manipulated the software so that the odds of games such as blackjack are skewed heavily in the house's favor. Other fly-by-night gambling operations had simply run off with people's money. Even when these gambling houses are honest,
they are still perceived as a threat to society by many lawmakers. People addicted to gambling, for example, might log in and gamble unfettered for hours at a time from work or home. They might lose hundreds or thousands of dollars with a few clicks of the mouse.
Malcolm also addresses the issue of money laundering through online casinos. Criminals who make their money from illegal activities such as drugs are known to use online casino accounts to stash their profits. Once the money is in the casino, the crooks use the games themselves to transfer money to their associates. Some criminals set up private tables at online casino sites and then intentionally lose their money to business associates at the table. In other instances, the casino is part of the crime organization. All the criminal has to do in these cases is to lose money to the casino.
In October 2006 Congress approved the Unlawful Internet Gambling Enforcement Act (Title VIII of the Security and Accountability for Every Port Act of 2006), which made it illegal for banks and credit card companies in the United States to make payments to Internet gambling sites, effectively ending online gambling nationwide. According to McCarthy and Swartz, PartyGaming, the world's largest online gambling company, generated 80% of its $1 billion revenue in 2005 from 920,000 active customers in the United States. At the time of the ban, the U.S. market accounted for an estimated 50% to 60% of online gambling worldwide. After the passage of the act, several bills came under consideration to review and revise the regulation of online gaming, but as of mid-2008 no further legislative action had been signed into law. Among the items introduced were the Internet Gambling Study Act, sponsored by Representative Shelley Berkley (1951–; D-NV), which sought to assess the impact of the Unlawful Internet Gambling Enforcement Act on online gaming in the United States, and the Internet Gambling Regulation and Tax Enforcement Act, sponsored by Representative James McDermott (1936–; D-WA), which proposed licensing operators of Internet gambling sites and imposing taxes and fees on both gamers and site operators.
The conversion from analog recordings to digital music during the 1980s changed the way Americans listened to music. Humans talk and listen in analog. When people speak, they create vibrations in their throats that then travel through the air around them like ripples in a pond. A membrane in the ear, known as an eardrum, picks up these vibrations, allowing people to hear. Patterns in these vibrations enable people to differentiate sounds from one another. Before compact discs (CDs) and MP3 files, all music was recorded in analog form. On a record player, the vibrations that create music are impressed in grooves on a vinyl disc. A needle passing over this impression vibrates in the same way, turning those vibrations into electrical waveforms that travel along a wire to an amplifier and into a speaker. With tape players, the analog waveforms are recorded in electronic form nearly verbatim on a magnetic tape.
The biggest problem with analog recordings is that each time the music is recorded or copied, the waveform degrades in quality much like a photocopy of an image. Digitizing the music resolves this problem of fidelity. To record and play music digitally, an analog-to-digital converter (ADC) and a digital-to-analog converter (DAC) are needed. In the recording process, the analog music is fed through the ADC, which samples the analog waveforms and then breaks them down into a series of binary numbers represented by zeros and ones. The numbers are then stored on a disc or a memory chip like any other type of digital information. To play the music back, these numbers are fed through a DAC. The DAC reads the numbers and reproduces the original analog waveform that then travels to the headphones or speakers. Because the numbers always reproduce a high-quality version of the original recording, no quality (fidelity) is lost, regardless of how many times the song is transferred or recorded.
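The sample-and-quantize step an ADC performs, and the reverse mapping a DAC performs, can be sketched in a few lines of Python. This is a minimal illustration rather than real audio code: the test tone and duration are arbitrary choices, though 44,100 sixteen-bit samples per second is the standard parameterization for CD audio.

```python
import math

def adc(signal, sample_rate, duration, bits=16):
    """Sample a continuous signal and quantize each sample to a signed integer."""
    levels = 2 ** (bits - 1)  # 32,768 quantization levels for 16-bit audio
    samples = []
    for n in range(int(sample_rate * duration)):
        t = n / sample_rate                          # time of the nth sample
        value = signal(t)                            # analog amplitude in [-1.0, 1.0]
        samples.append(round(value * (levels - 1)))  # store as a binary integer
    return samples

def dac(samples, bits=16):
    """Map the stored integers back to analog-style amplitudes."""
    levels = 2 ** (bits - 1)
    return [s / (levels - 1) for s in samples]

# A 440 Hz test tone sampled for a hundredth of a second at the CD rate.
tone = lambda t: math.sin(2 * math.pi * 440 * t)
digital = adc(tone, sample_rate=44100, duration=0.01)
restored = dac(digital)

# Rounding to the nearest level costs at most half a step, so every copy of
# the numbers reproduces the waveform to within that fixed, tiny error.
worst = max(abs(tone(n / 44100) - r) for n, r in enumerate(restored))
print(worst < 1 / 2**15)  # True
```

Because only the integers are copied from one medium to another, the reconstruction error never accumulates, which is the fidelity argument the passage makes.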
Digital music was first introduced into the U.S. mainstream in 1983 in the form of CDs. Klaas Compaan, a Dutch physicist, originally came up with the idea for the CD in 1969 and developed a glass prototype a year later at Philips Corporation. Over the next nine years both Philips and Sony worked on various prototypes of a CD player. In 1979 the two companies came together to create a final version and set the standards for the CD. The first CD players were sold in Japan and Europe in 1982 and then in the United States in 1983.
With a standard CD, music is recorded digitally on the surface of a polycarbonate plastic disc in a long spiral track 0.00002 inches (0.00005 cm) wide that winds from the center of the disc to the outer edges. A space 0.00006 inches (0.00015 cm) wide separates each ring of the spiral track from the one next to it. Tiny divots, or pits, a minimum of 0.00003 inches (0.00008 cm) long, are engraved into the surface of the track. The polycarbonate disc is then covered by a layer of aluminum and then a layer of clear acrylic. (See Figure 5.2.) As the disc spins in the disc drive, a laser follows this tiny track counterclockwise, and a light sensor, sitting next to the laser, tracks the changes in the laser light as it reflects off the CD. When the laser strikes a nondivoted section of track, the laser light bounces off the aluminum and back to the light sensor uninterrupted. However, each time the laser hits one of the divots along the CD track, the light is scattered.
These flashes of light represent the binary code that makes up the music. Electronics in the disc player read this code. The ones and zeros are then fed into a digital signal processor, which acts as a DAC, and the analog waveform for the music moves to the headphones or speakers.
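The read-out model described above, in which a clean reflection and a scattered flash stand for the two binary values, can be mimicked in a toy snippet. Note that this mirrors only the chapter's simplified description; actual CD players detect the transitions at pit edges and use an eight-to-fourteen modulation code, so the one-pit-equals-one-bit mapping below is an illustration, not the real disc format.

```python
# Toy version of the passage's model: the sensor reads a clean reflection
# off a flat stretch of track as a 0 and scattered light off a pit as a 1.
# (Real drives register pit *edges* and use eight-to-fourteen modulation.)
track = ["flat", "pit", "pit", "flat", "pit", "flat", "flat", "pit"]
bits = [1 if spot == "pit" else 0 for spot in track]
print(bits)  # [0, 1, 1, 0, 1, 0, 0, 1]
```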
When CD players were first released in the United States by both Sony and Philips in 1982, they were priced close to $900 apiece ($2,040 in 2008 dollars). The CDs themselves, which occupied a small section of the music store at the time, went for close to $20 apiece ($45 in 2008 dollars). Despite the high costs, the Census Bureau indicates in Statistical Abstract of the United States: 2003 (2004, http://www.census.gov/prod/2004pubs/03statab/inforcomm.pdf) that 22.6 million CDs sold in 1985. By 1990, 286.5 million CDs were sold. In Statistical Abstract of the United States: 2008, the Census Bureau reports that by 2000 this number peaked at 942.5 million and then dropped to 614.9 million by 2006 due to competition from MP3 players. Over the years CD players have become much more compact and have been equipped with many more features, often designed to increase sound quality. By 2008 personal CD players and portable CD stereo units were widely available for less than $50.
Rise of MP3
In 1985 the first CD read-only memory (CD-ROM) players were released for computers, again by Sony and Philips. CD-ROM players can read computer data from CD-ROMs as well as music from CDs. Even though people with early CD-ROMs were able to listen to CD music, downloading it onto a computer was difficult. A three-minute song on a CD consisted of roughly 32 megabytes. (Each byte consists of a string of eight ones and zeros that can be used to represent binary numbers from 0 to 255. In binary, which is a base-two number system, 1 is 00000001, 2 is 00000010, 3 is 00000011, and so on up to 255, which is represented as 11111111.) During the late 1980s and early 1990s most computer hard drives were only big enough to hold a few songs straight from an audio CD. In 1987 researchers at the Fraunhofer Institute for Integrated Circuits in Germany began to look into ways to compress digital video and sound data into smaller sizes for broadcasting purposes. Out of this work, the MP2 (MPEG-1 Audio Layer II) and then the MP3 (MPEG-1 Audio Layer III) audio file formats emerged. Other compression formats, such as Windows Media Audio and Advanced Audio Coding, have come onto the market since then but are not nearly as well known.
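The figures in the paragraph above are easy to check. The binary patterns follow from eight bits per byte, and the roughly 32-megabyte size of a three-minute track follows from the standard CD audio parameters (assumed here, since the text does not state them): 44,100 samples per second, two bytes per sample, two stereo channels.

```python
# A byte holds eight bits, giving values from 0 (00000000) to 255 (11111111).
assert format(1, "08b") == "00000001"
assert format(2, "08b") == "00000010"
assert format(3, "08b") == "00000011"
assert format(255, "08b") == "11111111"

# Uncompressed CD audio: 44,100 samples/second x 2 bytes/sample x 2 channels.
bytes_per_second = 44100 * 2 * 2          # 176,400 bytes each second
song_bytes = bytes_per_second * 3 * 60    # a three-minute song
print(song_bytes)  # 31,752,000 -> roughly 32 megabytes, as the text says
```

At that size, a late-1980s hard drive of a few hundred megabytes really could hold only a handful of uncompressed songs.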
Using such compression formats and encoding software, digital songs can be compressed from 32 megabytes per song to as little as 3 megabytes per song. CD recordings pick up any and every sound in a studio or concert. Compression systems, such as MP3, work by cutting out sounds in CD recordings that people do not pay attention to or cannot hear. This may include sounds drowned out by louder instruments. In classical music, an MP3 encoder might cut out a nearly indiscernible note from a flautist or the sound of a faint cough in the audience. It then condenses the recording to one-tenth its previous size. When the file is played back, a decoder reconstructs the song. The compressed files sound nearly as good as CD tracks and much better than audiotapes.
At first, these compression formats and encoding software existed only on home computers. In February 1999 Diamond Multimedia released the first hard drive–based music player. Because early players mainly played MP3 formats, all hard drive–based music players, such as the Apple iPod, became known as MP3 players. All MP3 players consist of a hard drive (many were available in the 60- to 160-gigabyte range in 2008) and all the electronic circuitry necessary to transform MP3 and other compressed music files into analog music. Using a cable, these players can be hooked up directly to a home computer. Once connected, the user can download thousands of songs onto the hard drive of the MP3 device. When the user selects a song, a microprocessor in the player pulls the song from the hard drive. A built-in signal processor decompresses the MP3 file (or other type of compressed music file) into a digital CD format, converts the digital signal to an analog signal, and then sends the analog waveform to the headphones. Though compressed music files do not sound quite as good as CD tracks, people can place their entire music collection on a player smaller than the palm of their hand. Since 1999 significant advances in technology have led to major improvements in iPods and other MP3 players. In 2008 most new players had color screens and the ability to play video files, which were typically compressed using an MPEG-4 video compression format.
Hard drive–based music players continue to find their way into the hands of millions of Americans. The Consumer Electronics Association (January 2007, http://www.ce.org/Research/Sales_Stats/1219.asp) reports that sales of MP3 players reached 34.3 million in 2006, which was more than sales of digital cameras (32.2 million) or personal computers (24.4 million) that year. In The Infinite Dial 2008: Radio's Digital Platforms (2008, http://www.arbitron.com/downloads/digital_radio_study_2008.pdf), the media and marketing researchers Arbitron Inc. and Edison Digital Media indicate that nearly four out of ten (37%) Americans aged twelve and older owned a portable device on which to play digital music files in 2008; nearly three-quarters (73%) of those in the twelve-to-seventeen age group owned MP3 players.
MP3 and Peer-to-Peer File Sharing
The widespread use of MP3 files and the increased size of hard drives in the late 1990s caused a sea change in the music industry almost as big as the advent of digital music. People suddenly had the ability to store entire music libraries on their computers and swap music for free over peer-to-peer file-sharing networks. According to Michael Gowan, in “Requiem for Napster” (PC World, May 17, 2002), the Napster file-sharing service had approximately eighty million subscribers at its peak. However, the availability of free music cut deeply into the recording industry's sales. The Census Bureau states in Statistical Abstract of the United States: 2006 (2005, http://www.census.gov/prod/2005pubs/06statab/infocomm.pdf) that the recording industry's revenues grew from $11 billion in 1998 to $13.7 billion in 2000. Revenues then fell from $13.7 billion to $13.2 billion in 2001. Revenues did not rise past 2000 levels again until 2003, and they rose at a much slower pace than in the late 1990s. Seeing diminishing profits at the turn of the twenty-first century, the Recording Industry Association of America (RIAA) sued Napster and the users of other peer-to-peer networks who shared music files (see Chapter 4). Some high-profile bands at the time, such as Metallica and Creed, joined the RIAA in its attempt to close down Napster. Other musicians, however, did not seem fazed by Internet file sharing. Radiohead released its album Kid A on the Internet three weeks before it was released in stores. The buzz generated by the Internet prerelease catapulted the album to number one in the United States after it hit record stores. Before Kid A, Radiohead had never had a number-one album in the United States.
In Artists, Musicians, and the Internet (December 5, 2004, http://www.pewinternet.org/pdfs/PIP_Artists.Musicians_Report.pdf), Mary Madden of the Pew/Internet states that musicians had mixed feelings about the impact of the Internet on the music business in 2004. They reported favorable impact in areas such as creativity; ability to reach a wider audience; ability to connect with family, friends, and other musicians while traveling to performances; and ease of scheduling performances and travel arrangements. However, musicians were somewhat divided over the issue of online file sharing. Even though 35% agreed that file-sharing services “are not bad” for musicians because the artist derives some promotional benefit from the download, 23% agreed that downloads “are bad” in that they allow people to obtain copyrighted material without paying for it. In the survey, 35% agreed to both statements, showing the ambivalence of artists themselves over the issue.
Regardless of what musicians thought, the lawsuits brought by the RIAA succeeded in putting an end to much of the free file swapping on the Internet. The free Napster Web site shut down in May 2000 and reopened a year later as a pay music service where users could buy songs. After the RIAA began going after private citizens, traffic on many of the remaining peer-to-peer sites diminished greatly. The number of people using the noncentralized Kazaa peer-to-peer network dropped precipitously after the RIAA became litigious with file swappers in 2003. (See Figure 5.3.) Lee Rainie et al. report in Data Memo: The State of Music Downloading and File-Sharing Online (April 2004, http://www.pewinternet.org/pdfs/PIP_Filesharing_April_04.pdf) that 38% of American adults who downloaded music from the Internet in 2004 had cut back somewhat after the RIAA lawsuits began. The number of Internet users who reported downloading music dropped from 32% in 2002 to 18% in 2004. At the same time, more people turned to pay music services, such as Musicmatch and iTunes. Rainie et al. note that over eleven million American adults visited online music services in 2004. Some 17% of those who reported downloading music were regularly getting their music from paid sources, and 31% were retrieving files from peer-to-peer networks. Mary Madden and Lee Rainie of the Pew/Internet note in Data Memo: Music and Video Downloading Moves beyond P2P (March 2005, http://www.pewinternet.org/pdfs/PIP_Filesharing_March05.pdf) that by 2005 the number of Internet users who reported downloading music had rebounded slightly to 22% overall. Of these, 27% were using pay services, 20% were regularly obtaining music through e-mail or instant messaging, and 16% continued to use peer-to-peer services such as Kazaa.
In “Kids Don't Like CDs: iTunes Store Now #2 Music Retailer” (February 26, 2008, http://arstechnica.com/news.ars/post/20080226-kids-dont-like-cds-itunes-store-now-2-music-retailer.html?rel), Jacqui Cheng finds that person-to-person downloading had leveled off by 2007, to about 19% of online music consumers, with legal downloading gaining market share to 10% of all music sales in 2007. Apple (June 19, 2008, http://www.apple.com/pr/library/2008/06/19itunes.html) announced in 2008 that its online iTunes store had sold more than five billion songs to consumers and had become the world's largest music retailer.
Television
Though inventors had been trying to create a television as early as 1877, many people consider Philo Farnsworth (1906–1971) to be the father of the first modern, electronic television. He demonstrated his device for the first time in San Francisco in 1927, when he transmitted an image of a dollar sign. Using Farnsworth's design, Radio Corporation of America (RCA; then the owner of the National Broadcasting Company [NBC] network) began work on the first commercial television system in the late 1930s. In 1939 the first commercial televisions were introduced. Early televisions had tiny screens and were as big as small dressers. The pictures were in black and white, and at first the major networks only broadcast in the largest cities.
Full-scale broadcasting began nationwide in 1947. According to the Museum of the Moving Image (2006, http://www.movingimage.us/site/education/content/behind/page17.html), by 1956, 85% of U.S. households owned a television set. By 1980 television had saturated the U.S. market, with 97.9% of U.S. households owning a television. (See Table 5.1.)
Since the early 1960s television has been the most popular medium of entertainment for Americans. In There's No Place Like Home to Spend an Evening, Say Most Americans (January 10, 2002, http://www.gallup.com/poll/5164/Theres-Place-Like-Home-Spend-Evening-Say-Most-Americans.aspx), Lydia Saad of the Gallup Organization notes that 27% of Americans in a December 1960 poll said their favorite way of spending the evening was in front of a television. Resting, reading, and entertaining and visiting friends ranked second, third, and fourth, respectively. Television appeared to hit its peak between the mid-1960s and early 1970s. A full 46% of people polled in February 1974 rated watching television as their favorite evening activity, followed by reading (14%), dining out (12%), and the somewhat ambiguous response of “staying at home with the family” (10%).
Joseph Carroll of the Gallup Organization notes in Family Time Eclipses TV as Favorite Way to Spend an Evening (March 10, 2006, http://www.gallup.com/poll/21856/Family-Time-Eclipses-Favorite-Way-Spend-Evening.aspx) that by December 2005 watching television had dropped below 1960 levels, with only 22% of people saying that watching television was their favorite leisure activity. Thirty-two percent of respondents said they enjoyed spending time with family the most. Television nevertheless continued to eclipse every other individual form of entertainment: 11% of people said they read at night, 1% reported listening to music, and 1% said they worked on the home computer in 2005. (See Figure 5.4.)
Since the 1970s steady advances in cable, satellite, and digital technology have changed the way Americans watch television. Cable television was installed in 19.9% of U.S. households with a television in 1980 (about 15.2 million homes). (See Table 5.1.) This percentage rose dramatically through the early 1990s, leveling out at about two-thirds of American television households by the late 1990s.
Cable television began in 1948 in Mahanoy City, Pennsylvania. John Walson (1915–1993), the owner of an electronics shop, began selling televisions in 1947. However, few customers in the local area wanted to buy
|TABLE 5.1 Utilization of media, selected years 1980-2005|
|source: “Table 1099. Utilization of Selected Media: 1980-2005,” in Statistical Abstract of the United States: 2008, U.S. Department of Commerce, U.S. Census Bureau, Economics and Statistics Administration, 2007, http://www.census.gov/prod/2007pubs/08abstract/infocomm.pdf (accessed July 31, 2008). Non-governmental data compiled from 1980 through 1995, M. Street Corp. as reported by Radio Advertising Bureau; 1980 through 1990, Radio Facts; 1995 through 2005, Radio Marketing Guide and Fact Book for Advertisers; Television Bureau of Advertising, Inc., Trends in Television; Warren Communications News, Television and Cable Factbook; Editor & Publisher, Co., Editor & Publisher International Year Book.|
|[78.6 represents 78,600,000]|
|Households with—telephone servicea||Percent||93.0||93.3||93.9||94.0||94.6||94.6||95.5||95.5||94.2||92.4|
|Radio households—percent of total households||Percent||99.0||99.0||99.0||99.0||99.0||99.0||99.0||99.0||99.0||99.0|
|Radio households—average number of sets||Number||5.5||5.6||5.6||5.6||5.6||5.6||5.6||8.0||8.0||8.0|
|Television households—percent of total households||Percent||97.9||98.2||98.3||98.2||98.2||98.2||98.2||98.2||98.2||98.2|
|Television sets in homes||Millions||128||193||217||240||245||248||254||260||268||287|
|Average number of sets per home||Number||1.7||2.0||2.3||2.4||2.4||2.4||2.4||2.4||2.5||2.6|
|Color set households||Millions||63||90||94||99||101||102||105||107||108||109|
|Wired cable televisiond||Millions||15.2||51.9||60.5||67.1||68.6||69.5||73.2||74.4||73.8||73.9|
|Percent of TV households.||Percent||19.9||56.4||63.4||67.5||68.0||68.0||69.4||69.8||68.1||67.5|
|Alternative delivery system (ADS) householdsd||Millions||(NA)||(NA)||4.0||9.4||11.7||14.7||17.4||19.7||21.2||23.3|
|Percent of TV households.||Percent||(NA)||(NA)||4.2||9.3||11.4||14.1||16.3||18.2||19.3||20.8|
|Videocassette recorder (VCR) householdsd—percent of TV households||Percent||1.1||68.6||81.0||84.6||85.1||86.2||91.2||91.5||90.8||90.2|
|Commercial radio stations:b AM||Number||4,589||4,987||4,909||4,783||4,685||4,727||4,804||4,802||4,770||4,758|
|Television stations:f Total||Number||1,011||1,442||1,532||1,615||1,663||1,686||1,714||1,730||1,748||1,749|
|Cable television systemsg||Number||4,225||9,575||11,218||10,700||10,400||10,300||9,900||9,400||8,875||(NA)|
|Daily newspaper circulationh||Millions||62.2||62.3||58.2||56.0||55.8||55.6||55.2||55.2||54.6||53.3|
|NA Not available.
aFor occupied housing units. 1980 as of April 1; all other years as of March.
b1980-1995 as of December 31.
cAs of January of year shown. Excludes Alaska and Hawaii.
dWired cable and VCR as of February; ADS for fourth quarter. Excludes Alaska and Hawaii.
eAs of August 2000, September 2001, and October 2003.
fBeginning 1999, as of September.
gAs of January 1.
hAs of September 30.
a television because of the bad reception caused by the surrounding mountains. To increase sales potential, Walson erected an antenna on a nearby mountaintop, ran a cable from the antenna to his store, and connected it to his television. He then agreed to attach cables from his antenna to the houses of those who bought televisions from him. From then until the early 1970s, cable networks were generally only used in rural or mountainous areas. At most, early cable television programming included local broadcasts and a broadcast or two from a nearby region.
As early as 1965 the U.S. government and various contractors began putting up a communications satellite network. A satellite network remedied the biggest obstacle faced by broadcasters during the 1960s: the curvature of the earth. If the earth were flat, televisions could receive broadcast signals from thousands of miles away. Because of the curvature of the earth, however, these broadcast signals escape into space after traveling about 100 miles (161 km). With a satellite system in place, a transmitter on the East Coast can beam a signal to a satellite above Kansas. The satellite then relays the signal to the West Coast without interruption.
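The range limit can be estimated from simple geometry. The sketch below uses the standard horizon-distance approximation; the tower and antenna heights are hypothetical, and real broadcast signals bend slightly in the atmosphere, carrying somewhat farther than this purely geometric line of sight:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean radius of the earth in meters

def horizon_distance_m(antenna_height_m, radius=EARTH_RADIUS_M):
    """Distance to the horizon from a given height above a sphere,
    using the small-height approximation d = sqrt(2 * R * h)."""
    return math.sqrt(2 * radius * antenna_height_m)

# A 600 m broadcast tower (hypothetical) to a 10 m rooftop antenna:
# the maximum line-of-sight range is the sum of the two horizon distances.
d = horizon_distance_m(600) + horizon_distance_m(10)
print(f"{d / 1609:.0f} miles")  # about 61 miles of geometric line of sight
```

Beyond that range the straight-line path points into space, which is exactly what a relay satellite parked above the middle of the country exploits.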
Home Box Office (HBO) became the first pay cable station in 1972 and was the first television broadcaster to take advantage of a satellite communications network. HBO began in Wilkes-Barre, Pennsylvania, and broadcast its movies and shows to a limited number of cable networks in and around the state. In 1975, to expand the subscription television market, HBO leased the right to use one of the uplinks on RCA's Satcom I communications satellite. Once HBO was on the satellite network, any cable network provider around the United States
could buy a 9.8-foot (3-m) satellite dish and provide HBO for any house on the network. By 1978 HBO had one million customers. Ted Turner (1938–), who put his Atlanta-based station, WTBS, on the satellite network in 1976, created the Cable News Network (CNN) in 1980. This was followed by the Music Television Network (MTV) and a number of other stations in 1981, and an era of exponential growth for the cable industry followed. Table 5.2 summarizes the growth in the cable and pay television industry between 1975 and 2007. During this period, basic cable subscriptions skyrocketed 565%, from 9.8 million in 1975 to 65.1 million in 2007. The monthly average cost to basic cable subscribers went up as well, increasing from $6.50 in 1975 to an average monthly cost of $42.72 in 2007. (For comparison purposes, $6.50 in 1975 would be equal to about $26.43 in 2008 dollars.)
The early 1980s also saw the emergence of videocassette recorders (VCRs). Only 1.1% of households had a VCR in 1980. (See Table 5.1.) The percentage of households with VCRs in 1995 jumped to 81% and then proceeded to grow at a slower pace until eventually peaking at 91.5% in 2003; by 2005, with the market turn from videotape to digital video discs, the percent of homes with a VCR had decreased slightly to 90.2%.
Closed Captioning and the V-Chip
By the mid-1970s television had become the predominant medium not only for entertainment but also for news and emergency information. Recognizing this, the Federal Communications Commission (FCC) set aside part of the television broadcast spectrum for closed captioning for the hearing impaired. Closed captioning consists of scrolling text at the bottom of a television screen that spells out what is being said on television. Four years after the FCC action, the American Broadcasting Company (ABC), the Public Broadcasting Service (PBS), and NBC ran their first closed-captioned programs, which included the Wonderful World of Disney and the ABC Sunday Night Movie. At first, closed captioning was scripted, but then the National Captioning Institute developed a keyboard interface that could be used to create captioning for live television shows, sporting events, and newscasts. A stenographer listened to the audio portion of a broadcast and typed it into the broadcast in real time. In 1990 Congress required that all television receivers contain decoders that display closed captioning, and in the 1996 Telecommunications Act Congress insisted that closed captioning be included in all shows by the turn of the twenty-first century.
As part of this act, Congress also requested that broadcasters rate their shows. The industry heeded the request and developed the TV Parental Guidelines. At the start of every television show, a rating is shown in the corner of the screen for fifteen seconds. TV-G designates general audiences, whereas TV-MA warns of content not suitable for anyone under seventeen. Even with the ratings system in place, parents across the country found they still could not monitor all the shows on cable as well as on broadcast television. Gloria Tristani (1953–), an FCC commissioner, said in a statement (March 12, 1998, http://www.fcc.gov/Bureaus/Cable/News_Releases/1998/nrcb8003.html) given to Congress that in 1998 the average child spent twenty-five hours per week in front of the television. In response, the FCC mandated in 2000 that all television sets thirteen inches or larger contain a V-chip. The V-chip worked in conjunction with the ratings system. If a parent set the V-chip to allow only TV-G shows, the microchip would simply block every show not rated TV-G.
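The blocking logic itself is easy to sketch. The category names below are the real TV Parental Guidelines ratings, but the code is only an illustration of the filtering idea, not the chip's actual firmware:

```python
# Ratings ordered from least to most mature content.
RATING_ORDER = ["TV-Y", "TV-Y7", "TV-G", "TV-PG", "TV-14", "TV-MA"]

def is_blocked(show_rating, max_allowed):
    """Block any show rated above the parent's chosen ceiling."""
    return RATING_ORDER.index(show_rating) > RATING_ORDER.index(max_allowed)

# A parent who allows nothing above TV-G:
print(is_blocked("TV-G", "TV-G"))   # False: the show passes
print(is_blocked("TV-MA", "TV-G"))  # True: the show is blocked
```

The chip reads the rating embedded in the broadcast signal, compares it against the stored ceiling, and suppresses the picture when the comparison fails.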
|TABLE 5.2 Cable and pay television subscriptions, monthly rates, and revenue, selected years 1975-2007|
|source: Adapted from “Table 1114. Cable and Pay TV—Summary: 1975 to 2006,” in Statistical Abstract of the United States: 2008, U.S. Department of Commerce, U.S. Census Bureau, Economics and Statistics Administration, 2007, http://www.census.gov/compendia/statab/tables/08s1114.pdf (accessed September 2, 2008). Data from the Broadband Cable Financial Databook, 2004, 2005, 2006, 2007, the Cable Program Investor, and Cable TV Investor: Deals & Finance newsletters (monthly), and other publications. Published by SNL Kagan, a division of SNL Financial LC. Additional data for 2007 provided by SNL Kagan|
|[9,800 represents 9,800,000. Cable TV for calendar year. Pay TV as of December 31 of year shown.]|
|Cable TV||Unitsb (1,000)||Monthly rate (dol.)|
|Year||Avg. basic subscribers (1,000)||Avg. monthly basic rate (dol.)||Revenuea (mil. dol.)||Total payc||Pay cable||Non-cable delivered premium||All pay weighted averagec||Pay cable||Non-cable delivered premium|
|NA = Not available.
aIncludes installation revenue, subscriber revenue, and nonsubscriber revenue; excludes telephony and high-speed access.
bIndividual program services sold to subscribers.
cIncludes multipoint distribution service (MDS), satellite TV (STV), multipoint multichannel distribution service (MMDS), satellite master antenna TV (SMATV), C-band satellite, and DBS satellite. Includes average pay unit price based on data for major premium pay movie services.
Advances in Television in the 1990s
In 1994 a new way of broadcasting movies and television shows became available when RCA released its Direct Satellite System (DSS). The DSS was the first affordable satellite receiver available to the American public. By installing an 18-inch (46-cm) satellite dish on their houses, Americans could receive nearly two hundred channels in their living rooms. To squeeze so many channels into a stream of data small enough to travel through space and back, the direct broadcasting satellite provider had to use a form of digital compression known as MPEG-1. The MPEG-1 compression format works much like the MP3 format. (The MP3 format in fact was developed from the audio portion of the MPEG-1 format.) To employ the MPEG-1 format, all television shows recorded in analog must first be transformed by analog-to-digital converters (ADCs) into a digital format. MPEG-1 encoders, owned by the direct broadcasting satellite provider, then compress the digital data largely by removing redundant scenery between frames. For example, if a character's face movement is the only discernible movement between two frames in a movie, then the background from the first frame is applied to both frames, cutting out the redundant information in the second frame. The compressed signal is then beamed to the satellite network. On receiving the signal, the satellite network broadcasts the signal to homes all across the country. A DSS box in the home decompresses the signal and delivers it to the viewer.
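A toy sketch of that frame-differencing idea follows. Real MPEG encoders use motion compensation and block transforms rather than this simple pixel-by-pixel delta, but the principle of transmitting only what changed between frames is the same:

```python
# Temporal-redundancy toy: instead of storing every frame in full, store
# only the pixels that differ from the previous frame.

def encode_delta(prev_frame, frame):
    """Return (index, new_value) pairs for pixels that changed."""
    return [(i, v) for i, (p, v) in enumerate(zip(prev_frame, frame)) if p != v]

def decode_delta(prev_frame, delta):
    """Rebuild the full frame from the previous frame plus the changes."""
    frame = list(prev_frame)
    for i, v in delta:
        frame[i] = v
    return frame

# Two "frames" of a tiny 1-D picture; only one small region changes.
frame1 = [10, 10, 10, 200, 200, 10, 10, 10]
frame2 = [10, 10, 10, 180, 210, 10, 10, 10]

delta = encode_delta(frame1, frame2)
print(delta)                                  # [(3, 180), (4, 210)]
print(decode_delta(frame1, delta) == frame2)  # True
```

Here only two of eight pixels need to be transmitted for the second frame; across a mostly static scene, the savings compound frame after frame.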
Several years after the first direct broadcast satellites were released, cable companies introduced digital cable. Digital cable works in much the same way as direct broadcast satellites, but uses a slightly more advanced MPEG-2 format for compression. By digitally compressing their programming, cable providers found they could transmit ten times more television stations than before along their cables. Many cable providers added music stations, pay-per-view movies, and multiple movie channels to their services. The only drawback with both the digital cable and satellite systems is that the decoder can only work on one television at a time, and it is usually bulky.
In terms of television accessories, the big development of the late 1990s was the digital video disc (DVD). DVDs work in almost exactly the same way as CDs, but the standard DVD can hold up to seven times more information per disc, which allows it to carry the data needed for much larger video files. After its release in 1997, the DVD quickly became the preferred format for watching recorded movies. In Americans Inventory Their Gadgets (December 23, 2005, http://www.gallup.com/poll/20593/Americans-Inventory-Their-Gadgets.aspx), Carroll reveals that 83% of American adults had a DVD player by late 2005, only five percentage points below the share who owned a VCR. According to the research-marketing firm Digital Entertainment Group (DEG), in the press release “DVD Penetration Nears 85 Million Households” (July 11, 2006, http://www.dvdinformation.com/News/press/071106.html), sixty thousand DVD titles were available to U.S. consumers by mid-2006. In addition, the DEG reports in the press release “Home Entertainment Enjoys Another Outstanding Year” (January 7, 2008, http://www.dvdinformation.com/News/press/CES2008yearEnd.htm) that between the launch of the DVD market in 1997 and the end of 2007, 8.9 billion discs had been shipped to U.S. retailers; during the same period 229 million DVD players had been sold in the United States, not including computers and gaming systems equipped to play DVDs.
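The "seven times" comparison checks out roughly, assuming the standard capacities of about 700 megabytes for a CD and 4.7 gigabytes for a single-layer DVD:

```python
# Capacity ratio of a single-layer DVD to a standard CD (assumed capacities).
CD_MB = 700      # typical CD capacity in megabytes
DVD_MB = 4_700   # single-layer DVD capacity in megabytes

print(f"{DVD_MB / CD_MB:.1f}x")  # prints "6.7x", close to the sevenfold figure
```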
Many confuse the concept of digital cable with digital television. Digital cable simply uses digital technology to compress the size of broadcasts so the customer has more channels. The digital signal also does not degrade as it travels across miles of coaxial cable. Most of the programming fed through digital cable systems is not digitally recorded. Digital television, however, is digital from start to finish. Digital cameras are used to record the broadcast; cables, satellite systems, and broadcast towers send a digital signal; and digital televisions play the broadcast. The result is a television picture that more closely resembles an image on a computer monitor than an image on an analog television set. The FCC has established a number of standards for digital television, which will be mandatory for full-power stations after February 17, 2009. (See Table 5.3.) Standard-definition television (SDTV) has the resolution of an analog television, roughly 480 by 440 pixels, or about 210,000 pixels in total. The next step up in visual quality is enhanced-definition television (EDTV), which generally has the same overall resolution as SDTV but features a wider screen. Finally, high-definition television (HDTV) is the highest-quality television format, with resolutions up to 1,920 pixels horizontally and 1,080 pixels vertically. Overall, HDTV has more than two million pixels to display each image, which provides the viewer with nearly ten times the detail of SDTV. To meet the HDTV standard, a television must also support the latest Dolby surround-sound formats.
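The pixel arithmetic behind these comparisons, using the resolutions cited above:

```python
# Pixel counts for the standard- and high-definition formats discussed above.
sdtv = 480 * 440    # the standard-definition figure cited in the text
hdtv = 1920 * 1080  # the highest HDTV format

print(sdtv)                   # 211200 -- "about 210,000 pixels"
print(hdtv)                   # 2073600 -- "more than two million pixels"
print(round(hdtv / sdtv, 1))  # 9.8 -- nearly ten times the detail
```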
The digital television standards were adopted by the FCC after Congress passed the 1996 Telecommunications Act. The act called for a full conversion to digital
|TABLE 5.3 Digital television standards|
|source: “Digital Television Facts at a Glance,” in The Digital TV Transition: What You Need to Know about DTV, Federal Communications Commission, July 29, 2008, http://www.dtv.gov/whatisdtv.html (accessed August 7, 2008)|
|Date for final transition to digital is February 17, 2009. After that date, full-power stations will only broadcast digital signals.|
|Consumers will always be able to connect an inexpensive receiver, a digital-to-analog converter box, to their existing analog TV to decode DTV broadcast signals.|
|Digital-to-analog converter boxes will not convert an analog TV to high-definition.|
|Analog TVs will continue to work with cable, satellite, VCRs, DVD players, camcorders, video game consoles, and other devices for many years.|
|Digital cable or digital satellite does not mean a program is in high-definition.|
|Digital pictures will be free from the “ghosts” and “snow” that can affect analog transmissions.|
|Multicasting is available.|
|HDTV is available.|
|Data streaming is available.|
|High-definition broadcasts offer the best available picture resolution, clarity, and color; Dolby theater surround sound; and a wide-screen, “movie-like” format.|
television across the United States by 2006. By that time every television station serving every market in the United States was required to air digital programming in one of the formats described in Table 5.3. In addition, broadcasters no longer had to air analog content. Americans with an analog television set would have been required to buy either a digital television or a $50 to $100 digital-to-analog converter box to watch television. In 2005, however, only a small percentage of Americans had HDTV or even EDTV. Consequently, Congress pushed back the deadline for the transition to digital television to February 17, 2009.
On January 1, 2008, the National Telecommunications and Information Administration of the U.S. Department of Commerce began issuing $40 coupons that defrayed the cost of converter boxes, allowing consumers to continue using older, analog television sets after the conversion to digital broadcasting. Households were eligible to receive two coupons and were required to redeem them toward the purchase of converter boxes within ninety days. Table 5.4 shows the number of coupon requests that had been received, processed, and redeemed as of August 26, 2008. As of that time, the TV Converter Box Coupon Program had received 25.2 million requests from 13.4 million households and was averaging about 105,000 applications per day.
The many advances in new media have fundamentally changed the manner in which Americans get their news. Table 5.5 reveals the ways in which people received their
|TABLE 5.4 TV Converter Box Coupon Program, August 2008|
|source: “TV Converter Box Coupon Program Weekly Status Update,” in Program Statistics, U.S. Department of Commerce, National Telecommunications and Information Administration, August 27, 2008|
|Total funds committed||$719,140,849|
|Total funds available||$620,859,151|
|Average daily orders YTD||105,267|
|Average daily orders last 30 days||110,472|
|Average daily orders last week||114,541|
|Retailers/locations||2,489 / 28,198|
|Phone/online retailers||13 / 34|
|Converters/pass-through||146 / 73|
daily news from August 1995 through December 2006. In general, cable news networks, talk radio, and Internet news sources increased in popularity over these years, whereas nightly network news programs and local newspapers showed declines. Evening news programs on ABC, CBS, and NBC took the biggest hit, with viewership dropping from 62% in August 1995 to 35% in December 2006. Daily readership of local newspapers increased slightly between March 1998 and July 1999 to a peak of 54% before declining to 44% in 2004, the point at which it remained in 2006. The biggest increase occurred among Americans who got their daily news from cable television networks, which rose from 23% in August 1995 to 41% in December 2002, before dropping off slightly in 2004 and 2006. The number of people who said they received their news from the Internet increased from 3% in August 1995 to 22% in December 2006. Local television newscasts held fairly steady over the survey period and were the most popular source of daily news as of 2006.
A likely explanation for the decrease in network news viewership and newspaper readership could be that more people are turning to the Internet to get their daily news. The Internet has made getting news much more convenient. With the Internet, hundreds of magazines and news sites can be accessed in seconds. John B. Horrigan of the Pew/Internet notes in Online News (March 22, 2006, http://www.pewinternet.org/pdfs/PIP_News.and.Broadband.pdf) that the number of online Americans who visited news sites daily reached fifty million in 2005. The Internet, however, was not likely driving traditional news organizations out of business, as many had developed Web content that coordinated with their publications or broadcasts in other media. In the press release “Half of All U.S. Internet Users Visited News Sites in June 2006” (August 7, 2006, http://www.comscore.com/press/release.asp?press=971),
|TABLE 5.5 Public opinion on sources of news, by media type and frequency, selected years 1995-2006|
|source: Adapted from Media Use and Evaluation, The Gallup Organization, 2007, http://www.gallup.com/poll/1663/Media-Use-Evaluation.aspx (accessed August 7, 2008). Copyright © 2008 by The Gallup Organization. Reproduced by permission of The Gallup Organization.|
|PLEASE INDICATE HOW OFTEN YOU GET YOUR NEWS FROM EACH OF THE FOLLOWING SOURCES—EVERY DAY, SEVERAL TIMES A WEEK, OCCASIONALLY, OR NEVER. HOW ABOUT—[RANDOM ORDER]?|
|Every day||Several times a week||Occasionally||Never||No option|
|Local newspapers in your area||%||%||%||%||%|
|2006 Dec 11-14||44||13||28||14||*|
|2004 Dec 5-8||44||14||27||15||*|
|2002 Dec 5-8||47||13||26||14||*|
|1999 Jul 22-25||54||14||21||10||1|
|1998 Jul 13-14||53||15||22||10||—|
|1998 Mar 6-9||50||12||26||11||1|
|Nightly network news programs on ABC, CBS or NBC|
|2006 Dec 11-14||35||16||30||19||—|
|2006 Apr 7-9||45||19||21||15||*|
|2004 Dec 5-8||36||16||26||22||*|
|2002 Dec 5-8||43||16||25||15||1|
|1999 Jul 22-25||52||18||22||8||—|
|1998 Jul 13-14||55||19||19||7||*|
|1998 Mar 6-9||56||19||17||7||1|
|1995 Aug 11-14||62||20||15||3||*|
|Cable news networks such as CNN, Fox News Channel and MSNBC|
|2006 Dec 11-14||34||16||30||20||*|
|2004 Dec 5-8||39||16||25||20||*|
|2002 Dec 5-8||41||15||26||18||*|
|1999 Jul 22-25||29||22||32||17||*|
|1998 Jul 13-14||21||16||33||29||1|
|1998 Mar 6-9||22||16||34||27||1|
|1995 Aug 11-14||23||20||36||21||—|
|Local television news from TV stations in your area|
|2006 Dec 11-14||55||14||23||8||*|
|2004 Dec 5-8||51||19||19||11||*|
|2002 Dec 5-8||57||16||18||9||*|
|1999 Jul 22-25||58||14||19||9||*|
|1998 Jul 13-14||57||15||19||9||*|
|1998 Mar 6-9||56||17||17||9||1|
|1995 Aug 11-14||55||18||20||7||—|
|Radio talk shows|
|2006 Dec 11-14||20||9||27||44||*|
|2004 Dec 5-8||21||12||25||42||—|
|2002 Dec 5-8||22||10||29||39||*|
|1999 Jul 22-25||12||8||33||47||*|
|1998 Jul 13-14||12||9||21||58||*|
|1998 Mar 6-9||11||5||25||58||1|
|1995 Aug 11-14||12||5||35||48||—|
|News on the Internet|
|2006 Dec 11-14||22||11||24||43||*|
|2004 Dec 5-8||20||6||25||49||*|
|2002 Dec 5-8||15||8||27||50||*|
|1999 Jul 22-25||8||7||23||62||*|
|1998 Jul 13-14||7||6||17||70||*|
|1998 Mar 6-9||5||6||18||70||1|
|1995 Aug 11-14||3||3||12||82||—|
comScore Media Metrix notes that 94.1 million Americans (54% of Internet users in the United States, according to the firm's estimate) accessed a news site online during June 2006. Top news sites at that time included Yahoo! News, which had 31.2 million unique visitors, MSNBC (23.4 million), AOL News (20.4 million), and CNN (19.9 million).
Besides gaining popularity among Internet users, online news and information sources have also begun generating serious revenue. Online publishers took in more than $10.3 billion in 2005, an 18.9% increase over 2004. (See Table 5.6.) Of this sum, $4.8 billion came from publishing and broadcasting content on the Internet, $2 billion from online advertising, $479 million from licensing intellectual property, and $3.1 billion from other online operations.
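The growth figure cited above follows directly from the line items in Table 5.6. A minimal Python sketch (using the table's dollar figures, in millions) shows how the totals and the overall 18.9% increase are derived:

```python
# Revenue line items from Table 5.6, in millions of dollars: (2004, 2005).
revenue = {
    "publishing and broadcasting of content": (4_482, 4_763),
    "online advertising space":               (1_525, 1_969),
    "licensing of intellectual property":     (384, 479),
    "all other operating revenue":            (2_303, 3_128),
}

# Sum each year's components, then compute the year-over-year growth rate.
total_2004 = sum(y2004 for y2004, _ in revenue.values())
total_2005 = sum(y2005 for _, y2005 in revenue.values())
pct_change = (total_2005 - total_2004) / total_2004 * 100

print(f"2004 total: ${total_2004:,} million")  # $8,694 million
print(f"2005 total: ${total_2005:,} million")  # $10,339 million (more than $10.3 billion)
print(f"Change: {pct_change:.1f}%")            # 18.9%
```

The same percent-change formula, (new − old) / old × 100, reproduces each row of the table's final column.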
High technology has changed not only how news and information are sold but also how reporters and writers do their jobs. Advanced communications and video technology have allowed reporters with established organizations to report from anywhere in the world in real time, something that most modern viewers take for granted. With the Internet, anyone can report on current events or start a publication or Web log (blog) and begin writing commentary. No longer do reporters and writers have to work for a large publishing house or magazine to build a reputation. In Bloggers: A Portrait of the Internet's New Storytellers (July 19, 2006, http://www.pewinternet.org/pdfs/PIP%20Bloggers%20Report%20July%2019%202006.pdf), Amanda Lenhart and Susannah Fox of the Pew/Internet report that 8% of Internet users in the United States had established blogs by 2006. About 5% of these bloggers indicated that news and current events were the main focus of their blogs. Even though 65% of bloggers did not consider themselves professional journalists, 29% reported that motivating others to action was a major reason for writing their blog, and 27% indicated that influencing the way other people think was an important reason for their blog.
Some blogs have grown so popular that they have a readership bigger than some major newspapers and magazines. For example, Matt Drudge (1967–) began the Drudge Report Web site/blog in 1997 to report on current events. He was largely responsible for breaking the story of President Bill Clinton's (1946–) relationship with the former White House intern Monica Lewinsky (1973–) in
TABLE 5.6 Online publishing and broadcasting—estimated revenue and expenses, 2004-05

SOURCE: “Table 1116. Internet Publishing and Broadcasting—Estimated Revenue and Expenses: 2004 and 2005,” in Statistical Abstract of the United States: 2008, U.S. Department of Commerce, U.S. Census Bureau, Economics and Statistics Administration, 2007, http://www.census.gov/prod/2007pubs/08abstract/infocomm.pdf (accessed July 31, 2008)

[In millions of dollars]

| Item | 2004 | 2005 | Percent change, 2004-2005 |
|---|---|---|---|
| **Source of revenue:** | | | |
| Publishing and broadcasting of content on the Internet | 4,482 | 4,763 | 6.3 |
| Online advertising space | 1,525 | 1,969 | 29.1 |
| Licensing of rights to use intellectual property | 384 | 479 | 24.7 |
| All other operating revenue | 2,303 | 3,128 | 35.8 |
| **Breakdown of revenue by type of customer:** | | | |
| Government | (S) | (S) | (S) |
| Business firms and not-for-profit organizations | 6,330 | 7,405 | 17.0 |
| Household consumers and individual users | 2,022 | 2,416 | 19.5 |
| **Operating expenses:** | | | |
| Gross annual payroll | 2,747 | 3,260 | 18.7 |
| Employer's cost for fringe benefits | 360 | 419 | 16.4 |
| Temporary staff and leased employee expense | 251 | 160 | -36.3 |
| Expensed materials, parts, and supplies (not for resale) | 268 | 224 | -16.4 |
| Expensed purchase of other materials, parts, and supplies | 166 | (S) | (S) |
| Expensed purchased services | 1,227 | 1,401 | 14.2 |
| Expensed purchases of software | 61 | 66 | 8.2 |
| Purchased electricity and fuels (except motor fuel) | 20 | 17 | -15.0 |
| Lease and rental payments | 303 | 290 | -4.3 |
| Purchased repair and maintenance | 36 | 77 | 113.9 |
| Purchased advertising and promotional services | 807 | 951 | 17.8 |
| Other operating expenses | 2,730 | 3,455 | 26.6 |
| Depreciation and amortization charges | 360 | 614 | 70.6 |
| Government taxes and license fees | 27 | 85 | 214.8 |
| All other operating expenses | 2,343 | 2,755 | 17.6 |

S = Data do not meet publication standards
1998. By October 2008, the Web site (October 24, 2008, http://www.drudgereport.com/) reported more than 6.8 billion visits in the previous twelve months and a daily tally of 27.2 million visits.