A worldwide telecommunications network of business, government, and personal computers.
The Internet is a network of computers linking the United States with the rest of the world. Originally developed as a way for U.S. research scientists to communicate with each other, by the mid-1990s the Internet had become a popular form of telecommunication for personal computer users. The dramatic growth in the number of persons using the network heralded the most important change in telecommunications since the introduction of television in the late 1940s. However, the sudden popularity of a new, unregulated communications technology raised many issues for U.S. law.
The Internet, popularly called the Net, was created in 1969 for the U.S. Department of Defense. Funding from the Advanced Research Projects Agency (ARPA) allowed researchers to experiment with methods for computers to communicate with each other. Their creation, the Advanced Research Projects Agency Network (ARPANET), originally linked only four separate computer sites at U.S. universities and research institutes, where it was used primarily by scientists.
In the early 1970s, other countries began to join ARPANET, and within a decade it was widely accessible to researchers, administrators, and students throughout the world. The National Science Foundation (NSF) assumed responsibility for linking these users of ARPANET, which was dismantled in 1990. The NSF Network (NSFNET) then served as the technical backbone for Internet communications in the United States.
The Internet grew at a fast pace in the 1990s as the general population discovered the power of the new medium. A significant portion of the Net's content is written text, in the form of both electronic mail (e-mail) and articles posted in an electronic discussion forum known as the Usenet news groups. In the mid-1990s the appearance of the World Wide Web made the Internet even more popular. The World Wide Web is a multimedia interface that allows text, pictures, audio, and video to be transmitted together as documents known as web pages, which commonly resemble pages in a magazine. Together, these various elements have made the Internet a medium for communication and for the retrieval of information on virtually any topic.
The sudden growth of the Internet caught the legal system unprepared. Before 1996, Congress had passed little legislation on this form of telecommunication. In 1986, Congress passed the Electronic Communications Privacy Act (ECPA) (18 U.S.C.A. § 2701 et seq.), which made it illegal to read private e-mail. The ECPA extended most of the protection already granted to conventional mail to electronic mail. Just as the post office may not read private letters, neither may the providers of private bulletin boards, on-line services, or Internet access. However, law enforcement agencies can subpoena e-mail in a criminal investigation. The ECPA also permits employers to read their workers' e-mail. This provision was intended to protect companies against industrial spying, but it has generated lawsuits from employees who objected to the invasion of their privacy. Federal courts, however, have allowed employers to secretly monitor an employee's e-mail on a company-owned computer system, concluding that employees have no reasonable expectation of privacy when they use company e-mail.
Should the Internet Be Policed?
Few observers could have predicted the fuss that the Internet began to generate in political and legal circles in the mid-1990s. After all, the global computer network linking 160 countries was hyped relentlessly in the media in the early 1990s. It spawned a multimillion-dollar industry in Internet services and a publishing empire devoted to the online experience—not to mention Hollywood movies, newspaper columns, and new jargon. But the honeymoon did not last. Like other communications media before it, the Internet provoked controversy about what was actually sent across it. Federal and state lawmakers proposed crackdowns on its content. Prosecutors took aim at its users. Civil liberties groups fought back. As the various factions engaged in a tug-of-war over the future of this sprawling medium, the debate became a question of freedom or control: should the Internet be left alone as a marketplace of ideas, or should it be regulated, policed, and ultimately "cleaned up"? Although this question became heated during the early- to mid-1990s, it has remained a debated issue into the early 2000s.
More than three decades after Defense Department contractors put it up, the network remains free from official control. This system has no central governing authority for a very good reason: the general public was never intended to use it. Its designers in the late 1960s were scientists. Several years later, academics and students around the world got access to it. In the 1990s, millions of people in U.S. businesses and homes signed on. Before the public signed on, the network's early users had long since developed a kind of Internet culture—essentially, a freewheeling, anything-goes setting. The opening of the Internet to everyone from citizens to corporations necessarily ruptured this formerly closed society, and conflicts appeared.
Speech rights quickly became a hot topic of debate. The Internet is a communications medium, and people have raised objections to speech online just as they have to speech in the real world. The Internet allows for a variety of media—text, pictures, movies, and sound—and pornography is abundantly accessible online in all these forms. It is commonly "posted" as coded information to a part of the Internet called Usenet, a public issues forum that is used primarily for discussions. With over 10,000 topic areas, called news groups, Usenet literally caters to the world's panoply of interests and tastes. Certain news groups are devoted entirely to pornography. As the speed of the Internet increased dramatically with the development of broadband access in the late 1990s and early 2000s, not only has more of this type of information become more available, but also users have been able to access this information in greater quantity.
Several signs in 1994 predicted a legal crackdown on the Internet. Early on, U.S. Attorney General Janet Reno said criminal investigators were pursuing the originators of online child pornography. In July 1994, federal prosecutors won an obscenity conviction in Tennessee against the operators of a computer bulletin board system (BBS) called the Amateur Action BBS, a private porn subscription service. Quickly becoming a cause célèbre in the online world, the case raised the question of how far off a general Internet crackdown could be.
In December 1994, a college student's fiction raised a furor. Jake Baker, a sophomore in linguistics at the University of Michigan, published a story about sexual torture in the alt.sex.stories news group on Usenet. Its lurid detail was not unique in the news group, but something else was: Baker used the name of a female classmate for one of his fictional victims. Once the name was recognized, campus critics of pornography lashed out at Baker.
Baker's case demonstrated how seriously objections to Internet material would be taken. In January 1995, the University of Michigan opened an investigation, and soon, Federal Bureau of Investigation agents began reviewing Baker's e-mail. Baker insisted he meant no harm, suggesting that he wanted to be a creative writer. He even submitted to a psychological profile, which determined that he posed no danger to the student named in his story or to anyone else. But on February 9, 1995, federal authorities arrested him. He was charged with five counts of using interstate communications to make threats to injure—and kidnap—another person. Lacking any specific target for Baker's alleged threats, yet armed with allegedly incriminating e-mail, prosecutors charged that he was dangerous to other university students. The American Civil Liberties Union (ACLU) came to his aid, arguing in an amicus brief that the accusations were baseless and moreover violated Baker's First Amendment rights. A U.S. district court judge threw out the case.
The U.S. Senate had its own ideas about online speech. In February 1995, Senator J. James Exon (D-NE) introduced the Communications Decency Act (S. 314, 104th Cong., 1st Sess.). Targeting "obscene, lewd, lascivious, filthy, or indecent" electronic communications, the bill called for two-year prison sentences and fines of up to $100,000 for anyone who makes such material available to anyone under the age of 18. In its original form, the bill would have established broad criminal liability: users, online services, and the hundreds of small businesses providing Internet accounts would all be required to keep their messages, stories, postings, and e-mail decent. After vigorous protest from access providers, the bill was watered down to protect them: they would not be held liable unless they knowingly provided indecent material.
Several groups lined up to stop the Decency Act. Opposition came from civil liberties groups including the ACLU, the Electronic Frontier Foundation (EFF), and Computer Professionals for Social Responsibility, as well as from online services and Internet access providers. They argued that the bill sought to criminalize speech that is constitutionally protected under the First Amendment.
Although Congress eventually outlawed obscene and other forms of indecent sexual material on the Internet in the Communications Decency Act of 1996, 47 U.S.C.A. § 223, the statute was challenged immediately. In Reno v. American Civil Liberties Union, 521 U.S. 844, 117 S. Ct. 2329, 138 L. Ed. 2d 874 (1997), the Supreme Court found that most of the statute's provisions violated the First Amendment. Congress subsequently sought to focus its attention on legislation that proscribes the transmission of child pornography, though the Supreme Court in a series of cases found that these statutes were likewise unconstitutional.
The central concern in Reno and the subsequent cases was that Congress had prohibited constitutionally protected speech in addition to speech that is not afforded First Amendment protection. Some members of Congress and supporters of such legislation suggested that restrictions on obscene and indecent information are necessary in order to protect children who use the Internet. But opponents of these restrictions noted that the Internet cannot be reduced to include only that information that is appropriate for children, and the Supreme Court reached this precise conclusion.
Although the debate about whether the government should regulate pornography and other obscene material continued, much of the focus about Internet policing shifted to other issues that involve the Internet. One important issue has been how the government can protect copyright and other intellectual property owners from piracy that is somewhat common on the medium. Another major issue is how the government can prevent the dissemination of unwanted advertising, usually sent through e-mail and commonly referred to as spam. Likewise, computer viruses have caused millions of dollars of damages to computer owners in the United States and worldwide in the 1990s and 2000s, and most of these viruses have been distributed through the Internet.
Many Internet users, some of whom may otherwise object to government regulation of the medium, view governmental regulation that protects users from such problems as piracy, viruses, and spam more favorably than other forms of regulation. Nevertheless, even regulation of computer crime raises issues, such as whether such regulation may violate users' First Amendment rights or how government regulation protecting against these harms can be effective. As the Internet continues to develop, and even as the medium gradually becomes more standardized, these questions largely remain unanswered.
Crandall, Robert W., and James H. Alleman, eds. 2002. Broadband: Should We Regulate High-Speed Internet Access? Washington, D.C.: AEI-Brookings Joint Center for Regulatory Studies.
Federal Trade Commission. 1999. Self-Regulation and Privacy Online: A Report to Congress. Washington, D.C.: Federal Trade Commission.
Criminal activity on the Internet generally falls into the category of computer crime. It includes so-called hacking, or breaking into computer systems, stealing account passwords and credit-card numbers, and illegally copying intellectual property. Because personal computers can easily copy information—including everything from software to photographs and books—and the information can be sent anywhere in the world quickly, it has become much more difficult for copyright owners to protect their property.
Public and legislative attention, especially in the mid to late 1990s, focused on Internet content, specifically sexually explicit material. The distribution of pornography became a major concern in the 1990s, as private individuals and businesses found an unregulated means of giving away or selling pornographic images. As hard-core and child pornography proliferated, Congress sought to impose restrictions on obscene and indecent content on the Internet.
In 1996, Congress responded to concerns that indecent and obscene materials were freely distributed on the Internet by passing the Communications Decency Act (CDA) as part of the Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56. This law forbade the knowing dissemination of obscene and indecent material to persons under the age of 18 through computer networks or other telecommunications media. The act included penalties for violations of up to five years imprisonment and fines of up to $250,000.
The american civil liberties union (ACLU) and online Internet services immediately challenged the CDA as an unconstitutional restriction on freedom of speech. A special three-judge federal panel in Pennsylvania agreed with these groups, concluding that the law was overbroad because it could limit the speech of adults in its attempt to protect children. American Civil Liberties Union v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996).
The government appealed to the U.S. Supreme Court, but the Court affirmed the three-judge panel on a 7-2 vote, finding that the act violated the First Amendment. Reno v. American Civil Liberties Union, 521 U.S. 844, 117 S. Ct. 2329, 138 L. Ed. 2d 874 (1997). Though the Court recognized the "legitimacy and importance of the congressional goal of protecting children from the harmful materials" on the Internet, it ruled that the CDA abridged freedom of speech and that it therefore was unconstitutional.
Justice John Paul Stevens, writing for the majority, acknowledged that the sexually explicit materials on the Internet range from the "modestly titillating to the hardest core." He concluded, however, that although this material is widely available, "users seldom encounter such content accidentally." In his view, a child would have to have "some sophistication and some ability to read to retrieve material and thereby to use the Internet unattended." He also pointed out that systems for personal computers have been developed to help parents limit access to objectionable material on the Internet and that many commercial web sites have age-verification systems in place.
Turning to the CDA, Stevens found that previous decisions of the Court that limited free speech out of concern for the protection of children were inapplicable. The CDA differed from the laws and orders upheld in the previous cases in significant ways. The CDA did not allow parents to consent to their children's use of restricted materials, and it was not limited to commercial transactions. In addition, the CDA failed to provide a definition of "indecent," and its broad prohibitions were not limited to particular times of the day. Finally, the act's restrictions could not be analyzed as forms of time, place, and manner regulations because the act was a content-based blanket restriction on speech. Accordingly, it could not survive the First Amendment challenge.
In 1998, Congress responded to the decision by enacting the Child Online Protection Act (COPA), Pub. L. No. 105-277, 112 Stat. 2681. This act was narrower in its application than the CDA, applying only to commercial transactions and limited to content deemed to be "harmful to minors." The new statute was subject to immediate litigation. A federal district court placed a preliminary injunction on the application of the statute, and this decision was affirmed by the U.S. Court of Appeals for the Third Circuit. American Civil Liberties Union v. Reno, 217 F.3d 162 (3d Cir. 2000). Although the U.S. Supreme Court vacated the decision, it did so on procedural grounds rather than on the merits of the challenge. Ashcroft v. American Civil Liberties Union, 535 U.S. 564, 122 S. Ct. 1700, 152 L. Ed. 2d 771 (2002). On remand, the Third Circuit again affirmed the injunction, holding that the statute likely violated the First Amendment. American Civil Liberties Union v. Ashcroft, 322 F.3d 240 (3d Cir. 2003).
The questions raised in Reno and subsequent decisions have also been raised in the debate over the use of Internet filters. Many schools and libraries, both public and private, have installed filters that prevent users from viewing vulgar, obscene, pornographic, or other types of materials deemed unsuitable by the institution installing the software.
The ACLU, library associations, and other organizations that promote greater access to information have objected to the use of these filters, especially in public libraries. The first reported case involving libraries and Internet filters was Mainstream Loudoun v. Board of Trustees of the Loudoun County Library, 24 F. Supp. 2d 552 (E.D. Va. 1998). A Virginia federal court judge in that case ruled that the use of screening software by a library was unconstitutional, as it restricted adults to materials that the software found suitable for children. Courts have generally been split on this issue, and several have found that the use of these filters in public schools is allowed under the First Amendment.
Pornography is not the only concern of lawmakers and courts regarding potential crime on the Internet. The Internet has produced forms of terrorism that threaten the security of business, government, and private computers. Computer "hackers" have defeated computer network "firewalls" and have vandalized or stolen electronic data. Another form of terrorism is the propagation and distribution over the Internet of computer viruses that can corrupt computer software, hardware, and data files. Many companies now produce virus-checking software that seeks to screen and disable viruses when they arrive in the form of an e-mail or e-mail file attachment. However, computer hackers are constantly inventing new viruses, thus giving the viruses a window of time to wreak havoc before the virus checkers are updated. Moreover, the fear of viruses has led to hoaxes and panics.
One of the most infamous viruses, dubbed the Melissa virus, was created in 1999 by David Smith of New Jersey. It was sent through a Usenet newsgroup as an attachment to a message that purported to provide passwords for sex-related web sites. When the attachment was opened, it infected the user's computer. The program found the user's address book and sent a mass message with attachments containing the virus. Within a few days, it had infected computers across the globe and forced the shutdown of more than 300 computer networks from the heavy loads of e-mail that Melissa generated.
The Melissa virus represented one of the first instances where law enforcement personnel were able to take advantage of new technologies to track the creator of the virus. On April 1, 1999, about a week after the virus first appeared on the Usenet newsgroups, police arrested Smith. He pled guilty to one count of computer fraud and abuse. He was sentenced to 20 months in prison and was fined $5,000.
Another area of legal concern is the issue of libel. In tort law, libel and slander occur when the communication of false information about a person injures the person's good name or reputation. Where the traditional media are concerned, it is well settled that libel suits provide both a means of redress for injury and a punitive corrective against sloppiness and malice. Regarding communication on the Internet, however, there is little case law, especially on the key issue of liability.
In suits against newspapers, courts traditionally held publishers liable, along with their reporters, because publishers were presumed to have reviewed the libelous material prior to publication. Because of this legal standard, publishers and editors are generally careful to review anything that they publish. However, the Internet is not a body of material that is carefully reviewed by a publisher, but an unrestricted flood of information. If a libelous or defamatory statement is posted on the Internet, which is owned by no one, the law is uncertain as to whether anyone other than the author can be held liable.
Some courts have held that online service providers, companies that connect their subscribers to the Internet, should be held liable if they allow their users to post libelous statements on their sites. An online provider is thus viewed like a traditional publisher.
Other courts have rejected the publisher analogy and instead have compared Internet service providers to bookstores. Like bookstores, providers are distributors of information and cannot reasonably be expected to review everything that they sell. U.S. libel law gives greater protection to bookstores because of this theory (Smith v. California, 361 U.S. 147, 80 S. Ct. 215, 4 L. Ed. 2d 205 (1959)), and some courts have applied it to online service providers.
Trademark infringement on the Internet has also led to controversy and legal disputes. One of the biggest concerns for registered trademark and service mark holders is protection of the mark on the Internet. As Internet participants establish sites on the Web, they must create domain names, which are names that designate the location of the web site. Besides providing a name to associate with the person or business that created the site, a domain name makes it easy for Internet users to find a particular home page or web site.
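As a rough technical illustration (the address below is a generic placeholder, not one drawn from this article), the domain-name portion of a web address can be separated from the page it points to using Python's standard library:

```python
# A web address consists of a scheme (http), a domain name (the host
# the name designates), and a path to a page within that site.
from urllib.parse import urlparse

url = "http://www.example.com/products/index.html"
parsed = urlparse(url)

print(parsed.hostname)  # the domain name: www.example.com
print(parsed.path)      # the page within the site: /products/index.html
```

Translating the printed domain name into a numeric network address is then handled behind the scenes by the domain name system (DNS), which is what makes a memorable name valuable to its holder.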
As individuals and businesses devised domain names in this medium, especially during the mid to late 1990s, they found that the names they created were similar to, or replicas of, registered trademarks and service marks. Several courts have considered complaints that use of a domain name violated the rights of a trademark or service mark holder, and early decisions did not favor these parties' rights.
In 1999, Congress enacted the Anticybersquatting Consumer Protection Act, Pub. L. No. 106-113, 113 Stat. 1501. The act strengthened the rights of trademark holders by giving these owners a cause of action against so-called "cybersquatters" or "cyberpirates," individuals who register a third party's trademark as a domain name for the purpose of selling it back to the owner for a profit.
Prior to the enactment of this law, an individual could register a domain name using the trademark or service mark of a company, and the company would have to use a different domain name or pay the creator a sum of money for the right to use the name. Thus, for example, an individual could register the name www.ibm.com, which most web users would have associated with International Business Machines (IBM), the universally recognized business. Because another individual used this domain name, IBM could not create a web site using www.ibm.com without paying the cybersquatter a fee for its use. The 1999 legislation eradicated this problem.
During the 1990s, a number of companies were formed that operated entirely on the Internet. The rapid, speculative growth of these companies led the media to dub the phenomenon the "dot-com bubble." The success of these companies was relatively short-lived, as the "bubble" burst in early 2000. Many of these Internet companies went out of business, while those that remained had to devise new business strategies.
Notwithstanding these setbacks, the Internet itself has continued to develop and evolve. During the 1990s, the vast majority of Internet users relied upon telephone systems to log on to the Internet. This trend has changed drastically in recent years, as many users have subscribed to services that provide broadband access through such means as cable lines, satellite feeds, and other types of high-speed networks. These new methods for connecting to the Internet allow users to retrieve information at much faster speeds and will likely continue to change the types of content that are available through this medium.
"ACLU Analysis of the Cox/Wyden Bill (HR 1978)." July 10, 1995. American Civil Liberties Union site. Available online at <www.aclu.org> (accessed November 20, 2003).
"ACLU Cyber-Liberties Alert: Axe the Exon Bill!" April 29, 1995. American Civil Liberties Union site. Available online at <www.aclu.org> (accessed November 20, 2003).
"A Civil Liberties Ride on the Information Superhighway." 1994. Civil Liberties: The National Newsletter of the ACLU 380 (spring).
"Amicus Curiae Brief in re U.S. v. Jake Baker and Arthur Gonda, Crim. No. 95-80106, U.S. District Court Eastern District of Michigan Southern Division." April 26, 1995. American Civil Liberties Union site. Available online at <www.aclu.org> (accessed November 20, 2003).
Blanke, Jordan M. 2003. "Minnesota Passes the Nation's First Internet Privacy Law." Rutgers Computer & Technology Law Journal 29 (summer).
"Can the Use of Cyberspace Be Governed?" 1995. Congressional Quarterly Researcher (June 30).
"Constitutional Problems with the Communications Decency Amendment: A Legislative Analysis by the EFF." June 16, 1995. Electronic Frontier Foundation site. Available online at <www.eff.org> (accessed November 20, 2003).
"Legislative Update: Pending State Legislation to Regulate Online Speech Content." April 17, 1995. American Civil Liberties Union site. Available online at <www.aclu.org> (accessed November 20, 2003).
Leiter, Richard A. 2003. "The Challenge of the Day: Permanent Public Access." Legal Information Alert 22 (February): 10.
Peck, Robert S. 2000. Libraries, the First Amendment, and Cyberspace: What You Need to Know. Chicago: American Library Association.
Peters, Robert. 2000. "'Marketplace of Ideas' or Anarchy: What Will Cyberspace Become?" Mercer Law Review 51 (spring): 909–17.
"Prodigy Stumbles as a Forum Again." Fall 1994. Electronic Frontier Foundation site. Available online at <www.eff.org> (accessed November 20, 2003).
Reed, Cynthia K., and Norman Solovay. 2003. The Internet and Dispute Resolution: Untangling the Web. New York: Law Journal Press.
Smith, Mark, ed. 2001. Managing the Internet Controversy. New York: Neal-Schuman Publishers.
Tsai, Daniel, and John Sullivan. 2003. "The Developing Law of Internet Jurisdiction." The Advocate 61 (July).
"Internet." West's Encyclopedia of American Law. 2005. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3437702372.html
The Internet is the world's largest computer network. It is a global information infrastructure comprising millions of computers organized into hundreds of thousands of smaller, local networks. The term “information superhighway” is sometimes used to describe the function that the Internet provides: an international, high-speed telecommunications network that offers open access to the general public.
The Internet provides a variety of services, including electronic mail (e-mail), the World Wide Web (WWW), Intranets, File Transfer Protocol (FTP), Telnet (for remote login to host computers), and various file-location services.
HISTORY OF THE INTERNET
The idea for the Internet began in the early 1960s as a military network developed by the U.S. Department of Defense's Advanced Research Projects Agency (later renamed DARPA). At first, it was a small network called ARPANET, which promoted the sharing of supercomputers among military researchers in the United States. A few years later, DARPA began to sponsor research into a cooperative network of academic time-sharing computers. By 1969, the first ARPANET hosts were installed at Stanford Research Institute, the University of California, Los Angeles (UCLA), the University of California, Santa Barbara, and the University of Utah.
A second factor in growth was the National Science Foundation's NSFNET, built in 1986 for the purpose of connecting university computer science departments. NSFNET combined with ARPANET to form a huge backbone of network hosts. This backbone became what we now think of as the Internet (although the term “Internet” was used as early as 1982).
The explosive growth of the Internet came with major problems, particularly related to privacy and security in the digital world. Computer crime and malicious destruction became a paramount concern. One dramatic incident occurred in 1988 when a program called the “Morris worm” temporarily disabled approximately 10 percent of all Internet hosts across the country. The Computer Emergency Response Team (CERT) was formed in 1988 to address such security concerns.
In 1990, as the number of hosts approached 300,000, the ARPANET was decommissioned, leaving only the Internet with NSFNET as its sole backbone. The 1990s saw the commercialization of the Internet, made possible when the NSF lifted its restriction on commercial use and cleared the way for the age of electronic commerce.
Electronic commerce was further enhanced by new applications being introduced to the Internet. For example, programmers at the University of Minnesota developed the first point-and-click method of navigating Internet files in 1991. This program, which was freely distributed on the Internet, was called Gopher; it was soon complemented by search tools such as Archie and Veronica.
An even more influential development, also started in the early 1990s, was Tim Berners-Lee's work on the World Wide Web, in which hypertext-formatted pages of words, pictures, and sounds promised to become an advertiser's dream come true. At the same time, Marc Andreessen and colleagues at the National Center for Supercomputing Applications (NCSA), located on the campus of the University of Illinois at Urbana-Champaign, were developing a graphical browser for the World Wide Web called Mosaic (released in 1993), work that Andreessen would later build upon to create Netscape.
By 1995, the Internet had become so commercialized that most access to the Internet was handled through Internet service providers (ISPs), such as America Online and Netcom. At that time, NSF relinquished control of the Internet, which was now dominated by Web traffic.
In 1995, partly motivated by the increased commercial interest in the Internet, Sun Microsystems released an Internet programming language called Java, which promised to radically alter the way applications and information could be retrieved, displayed, and used over the Internet.
By 1996, the Internet's twenty-fifth anniversary, there were 40 million Internet users; by 2002, that number had increased to 531 million, and by 2006 the number of Web users was roughly 750 million. Internet-based electronic commerce has reached major proportions as well, totaling roughly $140 billion in revenue in the United States alone in 2007. This number has continued to rise steadily throughout the 2000s.
Bandwidth is the capacity of a particular pathway to transmit information for online purposes. It is bandwidth that controls how fast web sites download. In analog settings (such as dial-up), bandwidth is measured by frequency, the difference between the highest and lowest frequencies carried, expressed in hertz. Digital lines measure bandwidth in bits per second (the amount of information transferred each second). Companies often determine and set the amount of bandwidth allowed for certain activities, a practice called bandwidth allocation.
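To make the bits-versus-bytes distinction concrete, the arithmetic behind download times can be sketched as follows; the connection speeds are illustrative round figures for dial-up and DSL, not measurements from this article:

```python
# Bandwidth is quoted in bits per second, while file sizes are usually
# given in bytes, so the size must be multiplied by 8 bits per byte.

def download_seconds(file_size_bytes: int, bandwidth_bps: int) -> float:
    """Ideal transfer time, ignoring protocol overhead and congestion."""
    return (file_size_bytes * 8) / bandwidth_bps

ONE_MEGABYTE = 1_000_000  # bytes

# A 1 MB web page over a 56 kbit/s dial-up modem vs. a 1 Mbit/s DSL line.
dialup = download_seconds(ONE_MEGABYTE, 56_000)
dsl = download_seconds(ONE_MEGABYTE, 1_000_000)

print(f"dial-up: {dialup:.1f} s, DSL: {dsl:.1f} s")
# prints "dial-up: 142.9 s, DSL: 8.0 s"
```

The real-world figures are always somewhat worse than this ideal calculation, since protocol overhead and shared lines consume part of the nominal bandwidth.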
There are many types of Internet connections, which have grown in sophistication and speed over the Internet's history. The first is the analog connection, or dial-up, one of the cheapest and slowest ways to connect: the computer dials a phone number to access the network, and the modem converts digital data to analog signals and back as required. The analog format is the slowest connection and the one most subject to quality problems. ISDN, or integrated services digital network, is the international standard for digital Internet connections over ordinary phone lines. B-ISDN is a more recent standard for connections over other media, such as fiber optics.
DSL is an always-on connection that does not tie up the phone line the way an analog connection does. There are two main types of DSL: ADSL (asymmetric DSL), which is used most commonly in America and offers faster download than upload speeds, and SDSL (symmetric DSL), which transmits at the same speed in both directions and is more often found in Europe.
Others receive Internet access through cable, a broadband connection that operates over cable TV lines. Certain TV channels are used to send and receive Internet data, and because these coaxial cable connections can handle a much higher rate of data than phone lines, cable Internet service tends to be faster.
Wireless Internet is also becoming popular, connecting computers to the Internet through radio-wave transmissions. This requires a wireless hub or router that converts information into radio waves, but the connection can be accessed from anywhere within the radius of the broadcast.
Electronic mail, or e-mail, is the most widely used function on the Internet today. Millions of messages pass over Internet lines every day throughout the world. Compared to postal service, overnight delivery companies, and telephone conversations, e-mail via the Internet is extremely cost-effective and fast. E-mail facilities include sending and receiving messages, broadcasting messages to several recipients at once, storing and organizing messages, forwarding messages to other interested parties, maintaining address books of e-mail partners, and even transmitting files (called “attachments”) along with messages.
Internet e-mail messages are sent to an e-mail address. The structure of an e-mail address is as follows: PersonalID@DomainName
The personal identifier could be a person's name or some other way to uniquely identify an individual. The domain indicates the location of that individual and appears to the right of the at sign (@). A domain name is the unique name of a collection of computers connected to the Internet, usually owned by or operated on behalf of a single organization (a company, school, or agency). The domain name consists of two or more sections, each separated by a period.
Reading from right to left, the portions of the domain name run from most general to most specific in terms of location. In the United States, the rightmost portion of a domain is typically one of the following:
- com—indicating a commercial enterprise
- edu—indicating an educational institution
- gov—indicating a governmental body
- mil—indicating a military installation
- net—indicating a network resource
- org—indicating a nonprofit organization
In countries outside the United States, the rightmost portion of a domain name indicates the geographic origin of the domain; for example, Canadian e-mail addresses end with the abbreviation “ca.”
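The address anatomy described above can be sketched in code. A minimal example in Python, using a hypothetical address (jsmith@cs.example.edu):

```python
# Sketch: splitting an e-mail address into the parts described above.
# The address used here is a hypothetical example.

def parse_address(address: str) -> tuple[str, list[str]]:
    """Return (personal_id, domain_labels) for a PersonalID@DomainName address."""
    personal_id, _, domain = address.partition("@")
    # The domain consists of two or more sections separated by periods;
    # the rightmost label is the most general (e.g., "edu" or "ca").
    return personal_id, domain.split(".")

user, labels = parse_address("jsmith@cs.example.edu")
print(user)        # jsmith
print(labels[-1])  # edu  (the top-level portion, read right to left)
```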
WORLD WIDE WEB
The World Wide Web (WWW) is a system and a set of standards for providing a graphic user interface (GUI) to Internet communications. The Web is the single most important factor in the popularity of the Internet, because it makes the technology easy to use and presents information to users attractively and entertainingly.
Graphics, text, audio, animation, and video can be combined on Web pages to create dynamic and highly interactive access to information. In addition, Web pages can be connected to each other via hyperlinks. These hyperlinks are visible to the user as highlighted text, underlined text, or images that the user can click to access another Web page.
Browsers. Web pages are available to users via Web browsers, such as Mozilla/Firefox, Apple's Safari, Opera, or Microsoft's Internet Explorer. Browsers are programs that run on the user's computer and provide the interface that displays the graphics, text, and hyperlinks to the user. Browsers recognize and interpret the programming language called Hypertext Markup Language (HTML). HTML includes the ability to format and display text; size and position graphics images for display; invoke and present animation or video clips; and run small programs, called applets, for more complex interactive operations. Browsers also implement the hyperlinks and allow users to connect to any Web page they want.
Search Engines. Sometimes a user knows what information she needs, but does not know the precise Web page that she wants to view. A subject-oriented search can be accomplished with the aid of search engines, which are tools that can locate Web pages based on a search criterion established by the user. By far, Google is the most commonly used search engine.
Blogs. The ease with which users can publish their own information using the World Wide Web has created an opportunity for everyone to be a publisher. One outcome is that every topic, hobby, niche, and fetish now has a thriving community of like-minded people. Publishing on the Web became easier still with the advent of Web logs, or “blogs,” online diaries that opened the floodgates to an even greater level of individual participation in information sharing and community.
UNIFORM RESOURCE LOCATORS (URL)
A Uniform Resource Locator (URL) is a networked extension of the standard filename concept. It allows the user to point to a file in a directory on any machine on the Internet. In addition to files, URLs can point to queries, documents stored deep within databases, and many other entities. Primarily, however, URLs are used to identify and locate Web pages.
A URL is composed of three parts:
Protocol. This is the first part of the address. In a Web address, the letters “http” stand for Hypertext Transfer Protocol, signifying how this request should be dealt with. The protocol information is followed by a colon. URL protocols usually take one of the following types:
- http—for accessing a Web page
- ftp—for transferring a file via FTP
- file—for locating a file on the client's own machine
- gopher—for locating a Gopher server
- mailto—for submitting e-mail across the Internet
- news—for locating a Usenet newsgroup
Resource Name. This is the name of the server or machine to which the query should be directed. In an “http” request, the colon is followed by two forward slashes, indicating that the request should be sent over the network to a machine.
Path and File Name. The rest of a URL specifies the particular computer name, any directory tree information, and a file name, with the latter two pieces of information being optional for Web pages. The computer name is the domain name or a variation on it (on the Web, the domain is most commonly preceded by a machine prefix “www” to identify the computer that is functioning as the organization's Web server, as opposed to its e-mail server, etc.).
If a particular file isn't located at the top level of the directory structure (as organized and defined by whoever sets up the Web site), there may be one or more strings of text separated by slashes, representing the directory hierarchy.
Finally, the last string of text to the right of the rightmost slash is the individual file name; on the Web, this often ends with the extension “htm” or “html” to signify it's an HTML document. When no directory path or file name is specified (e.g., the URL http://www.domain.com), the browser is typically pointed automatically to an unnamed (at least from the user's perspective) default or index page, which often constitutes an organization's home or start page.
Thus, a full URL with a directory path and file name may look something like this: http://www.domain.com/products/index.html
Lastly, a Web URL might also contain, somewhere to the right of the domain name, a long string of characters that does not correspond to a traditional directory path or file name, but rather is a set of commands or instructions to a server program or database application. The syntax of these URLs depends on the underlying software program being used. Sometimes these can function as reusable URLs (e.g., they can be bookmarked and retrieved repeatedly), but other times they must be generated by the site's server at the time of use, and thus can't be retrieved directly from a bookmark or by typing them in manually.
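The URL parts described above can be pulled apart with Python's standard-library urllib.parse module; the URL below is a hypothetical example:

```python
# Sketch: decomposing a URL into protocol, resource name, and path,
# using the standard library. The URL itself is a hypothetical example.
from urllib.parse import urlparse

parts = urlparse("http://www.domain.com/products/index.html")
print(parts.scheme)  # http  (the protocol)
print(parts.netloc)  # www.domain.com  (the resource/machine name)
print(parts.path)    # /products/index.html  (directory path and file name)
```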
Spam. Commercial abuse of e-mail continues to be problematic as companies attempt to e-mail millions of online users in bulk. This practice is called “spamming,” and the messages themselves “spam” (so named after a skit by the comedy troupe Monty Python that involved the continuous repetition of the word). Online users are deluged with a massive amount of unwanted e-mail selling a wide array of products and services. Spam has become a network-wide problem as it impacts information transfer time and overall network load. Several organizations and governments are attempting to solve the spam problem through legislation or regulation.
Viruses. Computer viruses spread by e-mail have also grown as the Internet has grown. The widespread use of e-mail and the growing numbers of new, uninformed computer users has made it very easy to spread malicious viruses across the network. Security issues for both personal computers and for network servers will continue to be a crucial aspect of the ongoing development of the Internet and World Wide Web.
Intranets are private systems, contained within servers owned by companies. They are based on the same principles that govern the Internet but are not widely available; instead, they are used only for communicating and transferring company information between employees. Companies utilize intranets to protect valuable information from outside access, creating them with layers of protection in place. Because intranet systems are private, they do not suffer from some of the problems the Internet faces, such as speed-related performance issues from too many users trying to access the same sites. Companies can place multimedia presentations on their systems more easily, showing presentations and running training programs for employees.
Company uses for intranet systems are varied, including procedural manuals, employee benefit resources, orientation programs, software and hardware instructions, and even company social networks or e-zine postings. Intranets can also be constructed for a company's specific needs, tailored in functions and appearance. They can include simple files of information, such as spreadsheets or word documents. They can also incorporate search engines that employees can use to find particular components or analyze sets of data. Many also provide links to the Internet and relevant Web sites.
VoIP, or Voice over Internet Protocol, is a developing technology allowing users to access audio communication through their Internet connections. The Internet line sends voice transmissions in the form of data packets, like all other types of information stored on servers, which are then converted back into audio on a receiving phone system. Users of VoIP benefit by not having to pay for separate phone and Internet services. Beyond the software and hardware required to set up VoIP, companies usually do not need to pay for more than their normal Internet service.
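A minimal sketch of the packetizing idea behind VoIP: an audio byte stream is cut into small, numbered packets and reassembled in order on the receiving side. The payload size and the stand-in "audio" bytes are assumptions for illustration; real VoIP systems add codecs and transport protocols on top of this idea.

```python
# Sketch: cutting a voice byte stream into numbered data packets and
# rebuilding it, as described above. Payload size is an assumed example.

def packetize(audio: bytes, payload_size: int = 160) -> list[tuple[int, bytes]]:
    """Split audio bytes into (sequence_number, payload) packets."""
    return [
        (seq, audio[i:i + payload_size])
        for seq, i in enumerate(range(0, len(audio), payload_size))
    ]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Rebuild the stream, tolerating packets that arrive out of order."""
    return b"".join(payload for _, payload in sorted(packets))

stream = bytes(range(256)) * 3          # stand-in for digitized voice
packets = packetize(stream)
print(len(packets))                     # 5 packets of up to 160 bytes
print(reassemble(packets) == stream)    # True
```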
The most important factors in VoIP service are audio quality and accessibility. VoIP can be provided by many different companies, including CoolTalk, Vonage, and Phone Power, but companies should always be sure to conduct tests of the audio quality to ensure it is as good as normal phone service. Also, some companies may prefer to have a back-up system installed in case of emergencies, such as Internet shut-downs or power outages.
Social networks have become increasingly popular in the past few years with the rise of such Web sites as MySpace and Facebook, where Internet users can create their own profiles and structure personal Web sites in online communities. Thanks to the ease of Internet communication, participants can form friendships and spread information at a high speed across a vast area. Businesses can make use of these social networks in several ways.
Many social networks employ widgets, or small embedded applications, often in the form of rich-media advertisements. These interactive advertisements can be posted along the edges of the Web sites and can serve as both marketing and analysis tools. By making an animated advertisement that can be clicked on or interacted with, a business can judge how attractive the advertisement is through programs designed to collect widget data. Because social networks spread information so quickly, businesses can also use them as platforms to propagate their messages and brand. Some companies have their own MySpace sites to use for marketing purposes, trying a more personal form of promotion that many social network users find honest. Other organizations are beginning to view social networks as an effective way to recruit new employees.
SMARTPHONES AND PDAS
Mobile, handheld computer devices are very common in today's business world. PDAs, which offer online interaction and note-taking abilities, are increasingly being replaced by smartphones, which are phones configured to offer the same services, including connection to the Internet, e-mail, and document programs. While many companies are eager to offer these mobile devices to their employees as a communication tool, only some currently take advantage of handhelds as a marketing tool. Web sites can be configured for the mini-browsers smartphones rely on, giving those using handheld devices easier access to online information and advertisements. The primary problem cited with smartphones and PDAs is security, since they are not covered by companies' intranet or Internet protections.
E-commerce can take many different forms. Some companies use a “click and mortar” system, operating stores or factories in physical locations while also offering their products in an online store where orders can be placed. Other companies, such as Amazon.com, have a central physical hub and warehouses from which they conduct a large amount of business over the Internet without retail storefronts. Still others, such as eBay, offer purely online services run from a central office.
A company's online store can be constructed to help customers personally by keeping track of what they view and what they order, and by offering similar products that they may be interested in. This is called personalization, and the ability to offer each customer a tailored experience every time he or she accesses the company Web site is a powerful marketing tool. It is also important for companies to update their online stores consistently to reflect changing services or merchandise, including deals and discounts. The interface companies use is also important: how the Web site looks and responds to customers, especially in searches and guided navigation.
The current quality of Web cams allows companies to transfer video images in real time, letting them use the Internet for video-conferencing. Some companies are beginning to use video-messaging, a service that often accompanies instant messaging. This technology works for one-on-one meetings as well as conferences involving multiple attendees.
SEE ALSO Computer Networks; Computer Security; Electronic Commerce; Electronic Data Interchange and Electronic Funds Transfer
"The Internet." Encyclopedia of Management. 2009. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3273100141.html
The Internet is a vast global system of interconnected technical networks made up of heterogeneous information and communication technologies. It is also a social and economic assemblage that allows diverse forms of communication, creativity, and cultural exchange at a scope and scale unknown before the late twentieth century.
The terms Internet and net are often used when discussing the social implications of new information technologies, such as the creation of new communal bonds across great distances or new forms of wealth and inequality. Such a usage is imprecise: The Internet is distinct from the applications and technologies that are built upon it, such as e-mail, the World Wide Web, online gaming, filesharing networks, and e-commerce and e-governance initiatives. There are also many networks that are or were once distinct from the Internet, such as mobile telephone networks and electronic financial networks.
Stated more precisely, the Internet is an infrastructural substrate that possesses innovative social, cultural, and economic features allowing creativity (or innovation) based on openness and a particular standardization process. It is a necessary, but not a sufficient, condition for many of the social and cultural implications often attributed to it. Understanding the particularity of the Internet can be key to differentiating its implications and potential impact on society from the impacts of “information technology” and computers more generally.
The Internet developed through military, university, corporate, and amateur user innovations occurring more or less constantly beginning in the late 1960s. Despite its complexity, it is unlike familiar complex technical objects—for example, a jumbo jetliner—that are designed, tested, and refined by a strict hierarchy of experts who attempt to possess a complete overview of the object and its final state. By contrast, the Internet has been subject to innovation, experimentation, and refinement by a much less well-defined collective of diverse users with wide-ranging goals and interests.
In 1968 the Internet was known as the ARPAnet, named for its principal funding agency, the U.S. Department of Defense Advanced Research Projects Agency (ARPA). It was a small but ambitious research project organized by the Information Processing Techniques Office at ARPA that focused on advanced concepts in computing, specifically graphics, time-sharing, and networking. The primary goal of the network was to allow separate administratively bounded resources (computers and software at particular geographical sites) to be shared across those boundaries, without forcing standardization across all of them. The participants were primarily university researchers in computer and engineering departments. Separate experiments in networking, both corporate and academic, were also under way during this period, such as the creation of “Ethernet” by Robert Metcalfe at Xerox PARC and the X.25 network protocols standardized by the International Telecommunications Union.
By 1978 the ARPAnet had grown to encompass dozens of universities and military research sites in the United States. At this point the project leaders at ARPA recognized a need for a specific kind of standardization to keep the network feasible, namely a common operating system and networking software that could run on all of the diverse hardware connected to the network. Based on its widespread adoption in the 1970s, the UNIX operating system was chosen by ARPA as one official platform for the Internet. UNIX was known for its portability (ability to be installed on different kinds of hardware) and extensibility (ease with which new components could be added to the core system). Bill Joy (who later cofounded Sun Microsystems) is credited with the first widespread implementation of the Internet Protocol (IP) software in a UNIX operating system, a version known as Berkeley Systems Distribution (BSD).
The Internet officially began (in name and in practice) in 1983, the date set by an ad hoc group of engineers known as the Network Working Group (NWG) as the deadline for all connected computers to begin using the Transmission Control Protocol and Internet Protocol (TCP/IP) protocols. These protocols were originally designed in 1973 and consistently improved over the ensuing ten years, but only in 1983 did they become the protocols that would define the Internet. At roughly the same time, ARPA and the Department of Defense split the existing ARPAnet in two, keeping “Milnet” for sensitive military use and leaving ARPAnet for research purposes and for civilian uses.
From 1983 to 1993, in addition to being a research network, the Internet became an underground, subcultural phenomenon, familiar to amateur computer enthusiasts, university students and faculty, and “hackers.” The Internet’s glamour was largely associated with the arcane nature of interaction it demanded—largely text-based, and demanding access to and knowledge of the UNIX operating system. Thus, owners of the more widespread personal computers made by IBM and Apple were largely excluded from the Internet (though a number of other similar networks such as Bulletin Board Services, BITNet, and FidoNET existed for PC users).
A very large number of amateur computer enthusiasts discovered the Internet during this period, either through university courses or through friends, and there are many user-initiated innovations that date to this period, ranging from games (e.g., MUDs, or Multi-User Dungeons) to programming and scripting languages (e.g., Perl, created by Larry Wall) to precursors of the World Wide Web (e.g., WAIS, Archie, and Gopher). During this period, the network was overseen and funded by the National Science Foundation, which invested heavily in improving the basic infrastructure of fiberoptic “backbones” in the United States in 1988. The oversight and management of the Internet was commercialized in 1995, with the backing of the presidential administration of Bill Clinton.
In 1993 the World Wide Web (originally designed by Tim Berners-Lee at CERN in Switzerland) and the graphical Mosaic Web Browser (created by the National Center for Supercomputing Applications at the University of Illinois) brought the Internet to a much larger audience. Between 1993 and 2000 the “dot-com” boom drove the transformation of the Internet from an underground research phenomenon to a nearly ubiquitous and essential technology with far-reaching effects. Commercial investment in infrastructure and in “web presence” saw explosive growth; new modes of interaction and communication (e.g., e-mail, Internet messaging, and mailing lists) proliferated; Uniform Resource Locators (URLs, such as http://www.britannica.com) became a common (and highly valued) feature of advertisements and corporate identity; and artists, scientists, citizens, and others took up the challenge of both using and understanding the new medium.
The core technical components of the Internet are standardized protocols, not hardware or software, strictly speaking—though obviously it would not have spread so extensively without the innovations in microelectronics, the continual enhancement of telecommunications infrastructures around the globe, and the growth in ownership and use of personal computers over the last twenty years. Protocols make the “inter” in the Internet possible by allowing a huge number of nonoverlapping and incompatible networks to become compatible and to route data across all of them.
The key protocols, known as TCP/IP, were designed in 1973 by Vint Cerf and Robert Kahn. Other key protocols, such as the Domain Name System (DNS) and User Datagram Protocol (UDP), came later. These protocols have to be implemented in software (such as in the UNIX operating system described above) to allow computers to interconnect. They are essentially standards with which hardware and software implementations must comply in order for any type of hardware or software to connect to the Internet and communicate with any other hardware and software that does the same. They can best be understood as a kind of technical Esperanto.
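The layering described above can be glimpsed through Python's standard-library socket module, which exposes TCP and UDP on top of IP. This is a minimal sketch, and the dotted address is a conventional private example number, not a real host:

```python
# Sketch: TCP and UDP live side by side on top of IP, as described above.
import socket

# TCP: reliable, connection-oriented byte stream.
tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# UDP: connectionless datagrams (used by DNS queries, among other things).
udp_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# IP addresses are 32-bit numbers, conventionally written in dotted form.
packed = socket.inet_aton("192.168.0.1")
print(len(packed))               # 4 bytes
print(socket.inet_ntoa(packed))  # 192.168.0.1

tcp_sock.close()
udp_sock.close()
```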
The Internet protocols differ from traditional standards because of the unconventional social process by which they are developed, validated, and improved. The Internet protocols are elaborated in a set of openly available documents known as Requests for Comments (RFCs), which are maintained by a loose federation of engineers called the Internet Engineering Task Force (IETF, the successor to the Network Working Group). The IETF is an organization open to individuals (unlike large standards organizations that typically accept only national or corporate representatives) that distributes RFCs free of charge and encourages members to implement protocols and to improve them based on their experiences and users’ responses. The improved protocol then may be released for further implementation.
This “positive feedback loop” differs from most “consensus-oriented” standardization processes (e.g., those of international organizations such as ISO, the International Organization for Standardization) that seek to achieve a final and complete state before encouraging implementations. The relative ease with which one piece of software can be replaced with another is a key reason for this difference. During the 1970s and 1980s this system served the Internet well, allowing it to develop quickly, according to the needs of its users. By the 1990s, however, the scale of the Internet made innovation a slower and more difficult procedure—a fact that is most clearly demonstrated by the comparatively glacial speed with which the next generation of the Internet protocol (known as IP Version 6) has been implemented.
Ultimately, the IETF style of standardization process has become a common cultural reference point of engineers and expert users of the Internet, and has been applied not only to the Internet, but also to the production of applications and tools that rely on the Internet. The result is a starkly different mode of innovation and sharing that is best exemplified by the growth and success of so-called “free software” or “open-source software.” Many of the core applications that are widely used on the Internet are developed in this fashion (famous examples include the Linux operating system kernel and the Apache Web Server).
As a result of the unusual development process and the nature of the protocols, it has been relatively easy for the Internet to advance around the globe and to connect heterogeneous equipment in diverse settings, wherever there are willing and enthusiastic users with sufficient technical know-how. The major impediment to doing so is the reliability (or mere existence) of preexisting infrastructural components such as working energy and telecommunications infrastructures. Between 1968 and 1993 this expansion was not conducted at a national or state level, but by individuals and organizations who saw local benefit in expanding access to the global network. If a university computer science department could afford to devote some resources to computers dedicated to routing traffic and connections, then all the researchers in a department could join the network without needing permission from any centralized state authority. It was not until the late 1990s that Internet governance became an issue that concerned governments and citizens around the world. In particular, the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) has been the locus of fractious dispute, especially in international arenas. ICANN’s narrow role is to assign IP numbers (e.g., 192.168.0.1) and the names they map to (e.g., www.wikipedia.org), but it has been perceived, rightly or wrongly, as an instrument of U.S. control over the Internet.
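The name-to-number mapping that ICANN coordinates is the same mapping a resolver performs. A minimal sketch using Python's standard library, with "localhost" chosen because it resolves without any network connection:

```python
# Sketch: resolving a name to an IP number, the mapping ICANN oversees
# globally. "localhost" is used so no network connection is needed.
import socket

ip_number = socket.gethostbyname("localhost")
print(ip_number)  # 127.0.0.1 on a conventionally configured machine
```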
With each expansion of the Internet, issues of privacy, security, and organizational (or national) authority have become more pressing. From the outset, the Internet protocols sought to prioritize control within administrative boundaries, leaving rules governing use to the local network owners. Such a scheme obviated the need for a central authority to determine global rules about access, public/private boundaries, and priority of use. With the advent of widespread commercial access, however, such local control has been severely diluted, and the possibility for individual mischief (e.g., identity theft, spam, and other privacy violations) has increased with increasing accessibility.
On the one hand, increased commercial access means a decline in local organized authority over parts of the Internet in favor of control of large segments by Internet Service Providers (ISPs) and telecommunications/cable corporations. On the other hand, as the basic infrastructure of the Internet has spread, so have the practices and norms that were developed in concert with the technology—including everything from the proper way to configure a router, to norms of proper etiquette on mailing lists and for e-mail. Applications built on top of the Internet have often adopted such norms and modes of use, and promoted a culture of innovation, of “hacking” (someone who creates new software by employing a series of modifications that exploit or extend existing code or resources, with good or bad connotations depending on the context), and of communal sharing of software, protocols, and tools.
It is thus important to realize that although most users do not experience the Internet directly, the development of the particular forms of innovation and openness that characterize the Internet also characterize the more familiar applications built on top of it, due to the propagation of these norms and modes of engineering. There is often, therefore, a significant difference between innovations that owe their genesis to the Internet and those developed in the personal computer industry, the so-called “proprietary” software industry, and in distinct commercial network infrastructures (e.g., the SABRE system for airline reservations, or the MOST network for credit card transactions). The particularity of the Internet leads to different implications and potential impact on society than the impacts of “information technology” or computers more generally.
One of the most widely discussed and experienced implications of the Internet is the effect on the culture industries, especially music and film. As with previous media (e.g., video and audio cassette recorders), it is the intersection of technology and intellectual property that is responsible for the controversy. Largely due to its “openness,” the Internet creates the possibility for low-cost and extremely broad and fast distribution of cultural materials, from online books to digital music and film. At the same time, it also creates the possibility for broad and fast violation of intellectual property rights—rights that have been strengthened considerably by the Copyright Act of 1976 and the Digital Millennium Copyright Act (1998).
The result is a cultural battle over the meaning of “sharing” music and movies, and the degree to which such sharing is criminal. The debates have been polarized between a “war on piracy” on the one hand (with widely varying figures concerning the economic losses), and “consumer freedom” on the other—rights to copy, share, and trade purchased music. The cultural implication of this war is a tension among the entertainment industry, the artists and musicians, and the consumers of music and film. Because the openness of the Internet makes it easier than ever for artists to distribute their work, many see a potential for direct remuneration, and cheaper and more immediate access for consumers. The entertainment industry, by contrast, argues that it provides more services and quality—not to mention more funding and capital—and that it creates jobs and contributes to a growing economy. In both cases, the investments are protected primarily by the mechanism of intellectual property law, and are easily diluted by illicit copying and distribution. And yet, it is unclear where to draw a line between legitimate sharing (which might also be a form of marketing) and illegitimate sharing (“piracy,” according to the industry).
A key question about the Internet is that of social equity and access. The term digital divide has been used primarily to indicate the differential in individual access to the Internet, or in computer literacy, between rich and poor, or between developed and developing nations. A great deal of research has gone into understanding inequality of access to the Internet, and estimates of both differential access and the rate of the spread of access have varied widely, depending on methodology. It is clear from the statistics, however, that between 1996 and 2005 the annual rate of growth in usage frequently exceeded 100 percent in almost all regions of the globe, and in some places it reached 500 percent or more. Aside from the conclusion that the growth in access to the Internet has been fantastically rapid, there are few sure facts about differential access.
There are, however, a number of more refined questions that researchers have begun investigating: Is the quantity or rate of growth in access to the Internet larger or smaller than in the case of other media (e.g., television, print, and radio)? Are there significant differences within groups with access (e.g., class, race, or national differences in quality of access)? Does access actually enhance or change a person’s life chances or opportunities?
The implication of a digital divide (whether between nations and regions, or within them) primarily concerns the quality of information and the ability of individuals to use it to better their life chances. In local terms, this can affect development issues broadly (e.g., access to markets and government, democratic deliberation and participation, and access to education and employment opportunities); in global terms, differential access can affect the subjective understandings of issues ranging from religious intolerance to global warming and environmental issues to global geopolitics. Digital divides might also differ based on the political situation—such as in the case of the Chinese government’s attempt to censor access to politicized information, which in turn can affect the fate of cross-border investment and trade.
SEE ALSO Information, Economics of; Media; Microelectronics Industry; Property Rights, Intellectual
Abbate, Janet. 1999. Inventing the Internet. Cambridge, MA: MIT Press.
Castells, Manuel. 2001. The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford: Oxford University Press.
DiMaggio, Paul, Eszter Hargittai, Coral Celeste, and Steven Shafer. 2004. Digital Inequality: From Unequal Access to Differentiated Use. In Social Inequality, ed. Kathryn Neckerman, 355–400. New York: Russell Sage Foundation.
International Telecommunication Union. ICT Indicators. http://www.itu.int/ITU-D/ict/.
Mueller, Milton. 2004. Ruling the Root: Internet Governance and the Taming of Cyberspace. Cambridge, MA: MIT Press.
National Telecommunications and Information Administration. A Nation Online: Entering the Broadband Era. http://www.ntia.doc.gov/reports/anol/index.html.
Norberg, Arthur L., and Judy E. O’Neill. 1996. Transforming Computer Technology: Information Processing for the Pentagon, 1962–1986. Baltimore: Johns Hopkins University Press.
Pew Internet and American Life Project. http://www.pewinternet.org.
Schmidt, Susanne K., and Raymund Werle. 1997. Coordinating Technology: Studies in the International Standardization of Telecommunications. Cambridge, MA: MIT Press.
Waldrop, M. Mitchell. 2001. The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal. New York: Viking Penguin.
Weber, Steven. 2004. The Success of Open Source. Cambridge, MA: Harvard University Press.
Christopher M. Kelty
"Internet." International Encyclopedia of the Social Sciences. 2008. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3045301168.html
The Internet is a vast network that connects many independent networks and links computers at different locations. It enables computer users throughout the world to communicate and to share information in a variety of ways. Its evolution into the World Wide Web made it easy to use for those with no prior computer training.
The Internet could not exist until the modern computer came to be. The first electronic computers were developed during the 1940s, and these early machines were so large—mainly because of all the bulky vacuum tubes they needed to perform calculations—that they often took up an entire room by themselves. They were also very expensive, and only a few corporations and government agencies could afford to own one. The decade of the 1950s proved to be one of silent conflict and tension between the Soviet Union and the United States—a period called the "cold war"—and computers naturally came to play a large role in those nations' military planning. Since each country was obsessed with the possibility of a deliberate or accidental nuclear war breaking out, the United States began to consider how it might protect its valuable lines of communication in case such a disaster did occur. By the 1960s, both nations had become increasingly dependent on their rapidly improving computing technologies, and the United States eventually developed a means of linking its major defense-related computer facilities together (to form a network). In 1969, the U.S. Department of Defense began a network of university and military computers that it called ARPANET (Advanced Research Projects Agency Network).
Words to Know
HTML: HyperText Markup Language, used in writing pages for the World Wide Web; it lets the text include codes that define font, layout, embedded graphics, and hypertext links.
HTTP: HyperText Transfer Protocol, which is the way World Wide Web pages are transferred over the Internet.
Hypertext: System of writing and displaying text that enables the text to be linked in multiple ways, to be available on several levels of detail, and to contain links to related documents.
Links: Electronic connections between pieces of information.
Network: A system made up of lines or paths for data flow that includes nodes where the lines intersect and where the data can flow into different lines.
Packets: Small batches of data that computers exchange.
Protocols: Rules or standards for operations and behavior.
World Wide Web: A hypermedia system that is a graphical map for the Internet, that is simple to understand, and that helps users navigate around Internet sites.
The major characteristic of ARPANET was the way it used the new idea called "packet switching." Packet switching breaks up data—the information to be transmitted from one computer to another—into equal-size message units called "packets." These packets are then sent separately to their destination, where they are reassembled to re-form the complete message. Each packet may travel a different route, and the message is eventually put back together at the destination no matter how or which way its pieces got there. For defense purposes, this system seemed ideal: if there were any working path to the final destination, no matter how indirect, the network would find it and use it to get the message through. In 1970, ARPANET began operations between only four universities, but by the end of 1971 it was linking twenty-three host computers.
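The packet-switching scheme described above can be sketched in a few lines of Python. This is an illustration of the idea only—the tiny packet size and the functions are invented for the example, not ARPANET's actual format:

```python
# Illustrative sketch of packet switching: split, scramble (as if packets took
# different routes), and reassemble by sequence number. Not a real protocol.
import random

PACKET_SIZE = 8  # bytes of payload per packet (real networks use roughly 1,500)

def split_into_packets(message: bytes) -> list[tuple[int, bytes]]:
    """Break a message into numbered, equal-size packets."""
    return [(seq, message[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(message), PACKET_SIZE))]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Put packets back in order by sequence number, however they arrived."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packets may travel different routes to one destination."
packets = split_into_packets(message)
random.shuffle(packets)  # simulate packets arriving out of order
assert reassemble(packets) == message
```

Because each packet carries its own sequence number, the receiver can rebuild the message even when the pieces arrive out of order—which is exactly what makes indirect, multi-path delivery workable.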
How computers could talk to one another
As this system slowly grew, it became apparent that eventually the computers at each different location would need to follow the same rules and procedures if they were to communicate with one another. In fact, if they all went their separate ways, spoke different "languages," and operated under different instructions, then they could never really be linked together in any meaningful way. More and more, the scientists, engineers, librarians, and computer experts who were then using ARPANET found that the network was both highly complex and very difficult to use. As early as 1972, users were beginning to form a sort of bulletin board for what we now call e-mail (electronic mail). This made the need for common procedures even more obvious, and in 1974, what came to be called a common protocol (pronounced PRO-tuh-call) was finally developed. Protocols are sets of rules that standardize how something is done so that everyone knows what to do and what to expect—sort of like the rules of a game. This common language came to be known as the Transmission Control Protocol/Internet Protocol (TCP/IP).
The development of this protocol proved to be a crucial step in the development of a real, working network since it established certain rules or procedures that eventually would allow the network really to expand. One of the keys of the protocol was that it was designed with what was called "open architecture." This meant that each network would be able to work on its own and not have to modify itself in any way in order to be part of the network. This would be taken care of by a "gateway" (usually a larger computer) that each network would have whose special software linked it to the outside world. In order to make sure that data was transmitted quickly, the gateway software was designed so that it would not hold on to any of the data that passed through it. This not only sped things up, but it also removed any possibility of censorship or central control. Finally, data would always follow the fastest available route, and all networks were allowed to participate.
In practice, the new TCP/IP set up a system that is often compared to a postal system. The information being sent or the "data packets" would have headers just as a letter has an address on its envelope. The header would therefore specify where it came from and what its destination was. Just as everyone's postal rules (protocols) state that all mail must be in an envelope or some sort of package and that it must have postage and a destination address, so TCP/IP said that every computer connected to the network must have a unique address. When the electronic packet was sent to the routing computer, it would sort through tables of addresses just as a mail sorter in a post office sorts through zip codes. It would then select the best connection or available route and send it along. On the receiving end, the TCP/IP software made sure all the pieces of the packet were there and then it put them back together in proper order, ready to be used. It makes no difference (other than speed) to the network how the data was transmitted, and one computer can communicate with another using regular phone lines, fiber-optic cables, radio links, or even satellites.
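The postal analogy can also be sketched in code: a packet header carries source and destination addresses, and a router picks the outgoing link by table lookup, like a mail sorter reading zip codes. All addresses and table entries below are invented for illustration:

```python
# Toy illustration of the postal analogy: headers as envelopes, routing as
# zip-code sorting. Addresses and the routing table are hypothetical.
from dataclasses import dataclass

@dataclass(order=True)
class Packet:
    seq: int       # sequence number, so the receiver can reorder the pieces
    src: str       # return address, like the one on an envelope
    dst: str       # destination address
    payload: bytes

# A router's (made-up) table mapping destination networks to outgoing links.
routing_table = {
    "10.0.1": "link-A",
    "10.0.2": "link-B",
}

def forward(packet: Packet) -> str:
    """Choose the outgoing link, like a mail sorter reading a zip code."""
    network = packet.dst.rsplit(".", 1)[0]  # crude "network part" of the address
    return routing_table[network]

p = Packet(seq=0, src="10.0.1.5", dst="10.0.2.9", payload=b"hello")
assert forward(p) == "link-B"
```

The router never inspects the payload—only the header—which is the sense in which the network, like the postal service, does not care how a letter eventually reaches its destination.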
Personal computers and domain names
All of this took some time, but by the beginning of 1983, when TCP/IP was officially adopted, the Internet—a network of networks—was finally born. To this point, most of the business on the "Net," as it came to be called, was science-oriented. About this same time, however, the microcomputer revolution was also starting to be felt. Called "personal computers," these new, smaller, desktop-size computers began slowly to enter businesses and homes, eventually transforming the notion of what a computer was. Until this time, a computer was a very large, super-expensive, anonymous-looking machine (called a "mainframe") that only corporations could afford. Now, however, a computer was a friendly, nearly portable, personal machine that had a monitor or screen like a television set. As more and more individuals purchased a personal computer and eventually learned about a way of talking to another computer (via e-mail), the brand-new Internet soon began to experience the problems of its own success.
By 1984, it was apparent that something had to be done to straighten out and simplify the naming system for each "host" computer (the host was the "server" computer that was actually linked to the Internet). That year, the system called "Domain Name Servers" was created. This new system organized Internet addresses into various "domains" or categories—such as governmental (.gov), commercial (.com), educational (.edu), military (.mil), network sites (.net), or organizations (.org)—that were tacked onto the end of the address. Host or server names now were not only much easier to remember, but the alphabetical addresses themselves actually stood for a longer coded sequence of numbers that the computer needed in order to specifically identify an address. Thus, a person needed only to use a fairly short alphabetical address, which itself contained the more complex numerical sequence. By 2001, however, an entire batch of additional domain names (.biz, .info, .name, .museum, .aero, .coop, and .pro) had to be created to account for the increase in both specialization and use. This domain expansion is similar to the phone company issuing new area codes.
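What the domain name system does can be illustrated with a toy name table: a short alphabetical name stands in for the numeric sequence the computer actually needs. The names and numbers below are invented (in real Python, the standard-library call `socket.gethostbyname` performs the actual lookup over the network):

```python
# Toy name table illustrating what DNS does: map memorable names onto the
# numeric addresses computers use. All entries are invented examples.
name_table = {
    "www.example.edu": "192.0.2.10",
    "www.example.com": "192.0.2.20",
}

def resolve(hostname: str) -> str:
    """Look up the numeric address hidden behind an alphabetical name."""
    return name_table[hostname]

assert resolve("www.example.edu") == "192.0.2.10"
```

The user types only the short name; the lookup supplies the number, just as described above.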
By the mid-1980s, a second, larger network had grown up in the United States, and it would eventually absorb ARPANET. The National Science Foundation established its own cross-country network, called NSFNET, in order to encourage increased network communication by colleges and universities. NSFNET adopted the TCP/IP rules, but it did not allow its system to be used for non-educational purposes. This policy proved to be very important since it eventually led businesses to create networks of their own, and also encouraged several private "providers" to open for business. In 1987, the first subscription-based commercial Internet company, called UUNET, was founded. As the end of the 1980s approached, the Internet was growing, but it was still not the place for a beginner. The main problem was that every time users wanted to do something different on it (such as e-mail or file transfer), they had to know how to operate an entirely separate program. Commands had to be memorized or reference manuals constantly consulted. The Internet was not "user-friendly."
World Wide Web
The development of what came to be called the World Wide Web in 1991 marked the real breakthrough of the Internet to a mass audience of users. The World Wide Web is really a software package based on "hypertext." In hypertext, links are "embedded" in the text (meaning that certain key words are either underlined or appear in a different color) that the user can then click on with a mouse to be taken to another site containing more information. It was the development of the Web that made usage of the Internet really take off, since it was simple to understand and use and enabled even new users to explore or "surf" the Net. Without the World Wide Web, the Internet probably would have remained a mystery to the huge numbers of people who had neither computer expertise nor any desire for computer training.
The Web developed a new set of rules called HTTP (HyperText Transfer Protocol) that simplified address writing and that used a new programming language called HTML (HyperText Markup Language). This special language allowed users easily to jump (by clicking on a link) from one document or information resource to another. In 1993, the addition of the program called Mosaic proved to be the final breakthrough in terms of ease-of-use. Before Mosaic, the Web
was limited only to text or words. However, as a "graphical browser," the Mosaic program included multimedia links, meaning that a user could click on icons (pictures or symbols) and view pictures, listen to audio, and even see video. By 1995, with the addition of sound and graphics and the emergence of such large commercial providers as America Online (AOL), Prodigy, and CompuServe, interest in and usage of the Internet really took off.
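A minimal sketch of what an embedded hypertext link looks like in HTML, and how software can find it, can be written with Python's standard-library parser. The page and the URL it links to are made up for illustration:

```python
# A tiny hypertext page: the <a href="..."> tag is the embedded link a reader
# clicks to jump to another document. The URL here is a made-up example.
from html.parser import HTMLParser

page = """
<html><body>
<p>Read more about <a href="http://example.com/mosaic.html">Mosaic</a>.</p>
</body></html>
"""

class LinkFinder(HTMLParser):
    """Collect the destination of every link embedded in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # an anchor tag marks a hypertext link
            self.links.extend(value for name, value in attrs if name == "href")

finder = LinkFinder()
finder.feed(page)
assert finder.links == ["http://example.com/mosaic.html"]
```

A browser does essentially this on every page it displays: it finds the embedded links and turns each one into something the user can click.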
By the beginning of the twenty-first century, the Internet had become a vast network involving millions of users connected by many independent networks spanning over 170 countries throughout the world. People use it to communicate (probably the most popular use), and hundreds of millions of e-mail messages electronically fly across the globe every day. People also use it as they would a library, to do research of all types on all sorts of subjects. On almost any major subject, a user can find text, photos, video, and be referred to other books and sources. The
Internet also has commercial possibilities, and users can find almost any type of product being sold there. A person with a credit card can book an airline flight, rent a beach home and car, reserve tickets to a performance, and buy nearly anything else he or she desires. Some businesses benefit from this more than others, but there is no dismissing the fact that the Internet has changed the way business is conducted.
Used daily for thousands of other reasons, the Internet is many things to many people. It is a world-wide broadcasting medium, a mechanism for interacting with others, and a mechanism for obtaining and disseminating information. Today, the Internet has become an integral part of our world, and most would agree that its usefulness is limited only by our imagination.
[See also Computer software ]
"Internet." UXL Encyclopedia of Science. 2002. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3438100376.html
Traditionally, death has been a great taboo in Western culture, a topic delicately sidestepped in polite public company and private reflection alike. But since 1995, the taboo has been at least partially dispelled in the informational glut of the Internet, which has brought the subject of death within easy arm's reach of millions of the previously averse or oblivious—merely typing the letters "d-e-a-t-h" into the window of a search engine (e.g., www.google.com) yields no fewer than 23,600,000 items, enough to daunt even the most avid scholar or morbid connoisseur of mortality.
However, these web sites provide far more than mere information: There is practical help in the form of bereavement support and information on organ donation and living wills, death in cultures around the world, hospice care, and numerous other areas.
Some of the most useful sites guide the web surfer toward services as well as information. One such site is www.excite.com/family/family_in_crisis, which lists numerous links to social and medical services and information for those burdened with grief or terminal illness. The site lists links to other sites regarding euthanasia, suicide, estate planning, and many other related topics. Those with more theoretical concerns might profitably consult www.tripod.lycos.com. There, the student, teacher, or researcher can find additional links to a wealth of other informational sites.
Because search engines often yield a dizzying plethora of responses, it is useful to narrow the range of responses by making the topic as specific as possible. For example, instead of merely typing in "grief," one might add "AND" plus another word to limit the search—say, "children's." Then only topics pertaining to children's grief will appear on the list of responses, saving the searcher a good deal of time and effort by reducing the number of items to several dozen rather than several thousand.
Another important watchword for web surfing on this or any other topic is "vigilance," a critical tool in distinguishing between the trustworthiness of a site produced by a distinguished scholar, such as Michael Kearl, and a personal site titled "Buffy's Death Page." "Caveat emptor" should be the watchword for every Internet surfer, where triviality and fraud are as common as the authentic and rewarding.
Demographics of Death on the Web
A number of web sites specialize in a statistical approach to death—its causes and demographics, life expectancies, social factors, and so on. The data on these sites are updated frequently and are usually culled from reliable government and scholarly sources. Some such sites are devoted to particular segments of society. For example, www.runet.edu provides information on life expectancy for African Americans compared to whites, along with other health-related data. Government sites, such as www.cdc.gov/nchs, give a broader range of data for many different segments of American society, including major causes of death in various age groups.
In other sites the accent is on the individual—for example, by entering a name, place of death, or Social Security Number at www.vitalrec.com, one can locate the death record of anyone in the United States. This site also provides links to sites that yield overseas records as well.
Cross-Cultural and Religious Information
For those interested in the religious dimension of death and dying, there is a wealth of sites that provide access to information on the death rituals, funeral customs, and mourning practices of nearly every known religion or cult, major or minor. Other sites dwell on a more broadly cultural approach to the meaning of death and attitudes toward the dying—a site might be devoted to a single culture such as that of the Cree Indians (www.sicc.sk.ca), while others might explore a broad range of cultures. One of the best is found at www.encarta.msn.com. Sites such as these also provide links to related web sites, as well as to printed material and reading lists.
Grief and Bereavement
The most numerous death-related web sites are those that deal with grief, both as a subject of analysis and as a topic for practical guidance to coping. The Griefnet web site (www.griefnet.org) provides one of the most extensive support systems online. It includes several web pages and over thirty small e-mail support groups. Griefnet posts a companion site for children and parents.
Some sites are designed to deal with specific categories of grievers. The Australian Widownet site (www.grief.org.au) provides information and self-help resources for widows and widowers of all ages, religious backgrounds, and sexual orientations. Suicide often evokes special issues of grief. One particular site that includes personal testimony by those who have experienced the death of a loved one by suicide is www.1000deaths.com. Tragedy Assistance Program for Survivors, Inc., a nonprofit group that provides support to those who have lost a loved one who met his or her end while serving in the armed forces, can be found at www.taps.org. The site provides peer support, crisis information, a variety of resources, and the opportunity to establish a virtual memorial.
Other bereavement web sites provide information not only for the bereaved but also for the professionals who are a part of the death system. Genesis Bereavement Resources (www.genesisresources.com) provides a list of music, videos, and other material that may be helpful to grievers, health care professionals, funeral directors, and pastors.
No detail is too slight or awkward to escape the attention of web entrepreneurs. Bereavement Travel at www.bereavementtravel.com allows one to make travel arrangements at the time of death at the special bereavement rates offered by many airlines and hotels. This service is primarily a convenience for the bereaved.
There are also special sites dedicated to unique bereavement responses, including www.aidsquilt.org/Newsite, which provides information on the AIDS quilt that has been shown all over the United States as a memorial to victims of the illness. In addition to bereavement support, some sites offer guidance on life-threatening illnesses, such as www.cancer.org for the American Cancer Society and www.alz.org for the Alzheimer's Disease and Related Disorders Association.
Compassionate Friends, the best known of the national bereavement support groups for parents who have experienced the death of a child, has a web site at www.compassionatefriends.org. Here, one can locate local chapters, obtain brochures, form a local chapter, and catch up with the latest related news. There are also organizations that help visitors locate or start a grief support group.
Finally, there are sites for many well-known organizations that are part of the thanatology field. The Make-A-Wish Foundation (www.wish.org) fulfills special wishes for terminally ill children. They send children to theme parks, arrange meetings or phone calls with celebrities, and perform other special services for ill children.
Bereavement guidance on the web is not limited to those who have suffered the loss of human companions. Those dealing with the loss of a pet may go to the web site for the Association for Pet Loss and Bereavement at www.aplb.org. One of the most unique sites in this area is www.petloss.com, which provides online grief support and describes a special candle ceremony held weekly to commemorate the death of a pet. Also at this site, one can find reference to other related web sites, chat rooms, and telephone support.
The web offers a range of end-of-life issues, including care of the terminally ill, living wills, and hospice care. Choice in Dying (www.choices.org) is the organization that first devised a living will in 1967, long before states adopted a legal policy on this issue. This nonprofit organization and its web site provide counseling for patients and families, information on advanced directives, outline training resources for professionals, and serve as an advocate for improved laws. The American Institute of Life-Threatening Illnesses, a division of the Foundation for Thanatology, can be found at www.lifethreat.org. This organization, established in 1967, is dedicated to promoting improved medical and psychosocial care for critically ill patients and their families.
People have long complained about the high cost of funerals and related expenses. There are numerous web sites that offer online casket purchases and other related items. Such sites promise quick service and complete satisfaction, often at steep discounts. In addition to caskets, www.webcaskets.com offers urns, markers, flowers, and other funerary items. At www.eternalight.com one can purchase an "eternal" light, guaranteed to glow for thirty years. The light can be used at home as a permanent memorial to the loved one. The site donates 10 percent of the purchase price to a national support group of the customer's choice.
It is possible to plan an entire funeral service online at www.funeralplan.com. One can actually watch a funeral service from many funeral homes by going to www.funeral-cast.com. The National Funeral Directors Association maintains a site at www.nfda.org. Here, one can locate funeral homes, obtain consumer information, and learn about careers in this field.
Some web sites defy easy classification. One popular site is www.deathclock.com. Here one can plug in one's date of birth and gender, along with one's attitudinal and philosophical propensities, and obtain the likely date of one's demise. Visitors can watch the clock count down their time on Earth. Many college students find this to be a fascinating site and download a screen-saver version—every time they turn on their computers they watch their lives "tick away." Other interesting sites include www.1800autopsy.com, where one can contact a mobile company to perform such an examination, and www.autopsyvideo.com, which allows visitors to view autopsies online. These web sites are used by professionals and educators, as well as the curious.
The web is aswarm with jokes on all topics, and death is no exception. Some web pages specialize in bad-taste death jokes, many of which center on celebrities. One site in particular allows the visitor to "bury or cremate" someone. After entering a name and choosing a method of body disposal, one can watch as the casket burns up.
Obituaries and Last Words
Numerous web sites provide visitors with the opportunity to post memorial messages. Most of these sites charge a fee for permanent placement. At www.legacy.com, one can pay a fee of $195 to place a memorial, including photograph, on the site.
Memorialtrees.com arranges for a memorial tree to be planted in any state in the United States or in the Canadian provinces. The fee is less than thirty dollars and includes a certificate of planting and a card that Memorialtrees.com sends to the survivor.
Much attention has been paid to the issue of the near-death experience. Two sites that are particularly useful include www.iands.org, the official web site of the International Association for Near-Death Studies, replete with research information, case studies, and resources; and www.neardeath.com, which includes near-death experiences of people of various faiths along with the testimony of children and suicides who have had brushes with death.
Legal and Financial Issues
A number of sites offer guidance in the many practical and financial matters that arise after a death. One very comprehensive site is www.moneycentral.msn.com. Here, one can find answers to general questions regarding finances, collecting life insurance, and handling bills of the deceased. One can also obtain information on making a will without consulting an attorney. A site like www3.myprimetime.com includes information on estates as well as the impact of being a griever and executor.
The Internet has dramatically expanded the availability of resources in the field of thanatology, providing both useful and irrelevant sites. Anyone consulting web sites must be careful to sort through them to find those that are helpful and accurate.
See also: Death Education; Grief: Overview; Grief Counseling and Therapy; Last Words; Memorial, Virtual; Near-Death Experiences; Technology and Death
"Funeral Rites and Customs." In the Encarta [web site]. Available from www.encarta.msn.com.
Radford University. "Sociological Comparisons between African-Americans and Whites." In the Radford University [web site]. Available from www.runet.edu/-junnever/bw.htm
DANA G. CABLE
CABLE, DANA G. "Internet." Macmillan Encyclopedia of Death and Dying. 2003. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3407200157.html
An internet is a collection of interconnected computers that use networking hardware and software to send and receive data. The Internet is the global network of interconnected computers and servers available to the public. The World Wide Web is the collection of graphically intensive Web pages that has enabled the Internet to become a societal phenomenon.
THE ORIGINAL INTERNET
In the 1950s researchers and scientists across the country linked their mainframe computers via telephone connections operating at very slow speeds. This first network supported communication of basic text-based computer data. In the beginning, only federal agencies and a few research universities were linked. The system was funded by the Advanced Research Projects Agency, a technology and research group in the U.S. Department of Defense. The system was referred to as ARPANET.
The first four sites connected to ARPANET were the Stanford Research Institute, the University of California-Los Angeles, the University of California-Santa Barbara, and the University of Utah. Communications research in the 1960s led to decentralized networks, queuing theory, and packet switching. These technologies allowed different types of computers to send and receive data. Computers transmitted information in standardized units called packets. The addressing information in these packets told each computer in the system where the packet was supposed to go.
In 1972 the first electronic mail (e-mail) program was developed. It used file transfer protocol (FTP) to upload messages to a server that would then route the message to the intended computer terminal. This text-based communication tool greatly affected the rate at which collaborative work could be conducted between researchers at participating universities. This collaboration led to the development of the transmission control protocol (TCP), which breaks large amounts of data into packets of a fixed size, transmits the packets over the Internet using the Internet protocol (IP), and sequentially numbers them to allow reassembly at the recipient's end. The combination of TCP and IP is still the model used to move data over the Internet.
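The packetize-number-reassemble mechanism that TCP provides can be sketched in a few lines of code. The following is an illustrative toy model, not real TCP; the packet size, function names, and message are invented for the example.

```python
PACKET_SIZE = 8  # bytes per packet; real TCP segments are far larger

def packetize(data: bytes):
    """Break data into fixed-size, sequentially numbered packets."""
    return [(seq, data[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(data), PACKET_SIZE))]

def reassemble(packets):
    """Sort by sequence number and rejoin, regardless of arrival order."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"The combination of TCP and IP moves data over the Internet."
packets = packetize(message)
packets.reverse()  # simulate packets arriving out of order
assert reassemble(packets) == message
```

Because each packet carries its sequence number, the receiving end can rebuild the original data even when the network delivers the pieces out of order.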
In 1984 the Pentagon, the leadership of the U.S. military, decided the growing academic and community-based Internet was far too open and lacked the security required for a military network. They transferred control of the original ARPANET to the National Science Foundation (NSF) and created a separate and secure network called MILNET. The NSF added a network backbone, renamed it NSFNet and made it available to a much larger number of colleges and universities.
With more universities connected and participating in the Internet, more programs and communication applications were created. A program called Telnet allowed remote users to run programs and computers from other sites. Gopher, developed at the University of Minnesota and named after the university's mascot, allowed menu-driven access to data resources on the Internet. Search tools such as Archie and the Wide Area Information Server (WAIS) gave users the ability to search the Internet's numerous libraries and indexes. By the mid-1980s users at universities, research laboratories, private companies, and libraries were empowered by the new networking revolution. More than 30,000 host computers and modems were actively using the Internet.
THE INTERNET AND THE WORLD WIDE WEB
In August 1991, Dr. Tim Berners-Lee (1955– ) of CERN (the European Organization for Nuclear Research) in Switzerland envisioned the concept of a graphical, page-based Internet—the World Wide Web. Although many people use the terms Internet and World Wide Web interchangeably, they refer to two separate, yet related, technologies. The Web is supported by hypertext markup language (HTML), a markup language used to create graphical Web pages, and hypertext transfer protocol (HTTP), the routing technology used to identify uniform resource locators (URLs), or Web page addresses.
Web pages are retrieved via Internet protocols and resources; the Web, however, is merely one of many Internet applications such as FTP, Telnet, and Gopher. Berners-Lee developed the Web as a way to simplify reading the location of documents by assigning standard names or file paths. In 1992 and 1993 the first Web browsers, Viola and Mosaic, were developed. The ease of use and graphic capabilities (prior Internet data exchanges were primarily text-based) made Web browsers popular outside the academic community, and soon the general public found access to the Internet and World Wide Web to be useful.
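The structure of a URL can be demonstrated with Python's standard library; the address below is an invented example.

```python
from urllib.parse import urlparse

parts = urlparse("http://www.example.org/pages/index.html")
print(parts.scheme)  # "http": the transfer protocol (HTTP)
print(parts.netloc)  # "www.example.org": the host serving the page
print(parts.path)    # "/pages/index.html": the path to the document
```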
The Internet and the World Wide Web continue to grow. The U.S. Census Bureau reported that in 2003, 61.8 percent of U.S. households had a computer and 54.7 percent had Internet access. Home use, however, does not reflect the number of people who use computers and the Internet at work, in libraries, at schools, and in community organizations. The Census Bureau found that nearly 60 percent of American adults used the Internet. Over 165 countries are connected to the Internet. Yet, no one nation or group operates or controls the Internet. Although there are entities that oversee the system, "no one is in charge." This allows for a free transfer and flow of information throughout the world. Search engines such as Google and Yahoo index the Web to help in the organization and retrieval of information.
USING THE INTERNET AND WORLD WIDE WEB
Accessing the Internet requires an Internet-capable computer and a modem to modulate/demodulate outgoing and incoming data packets. Modems connect computers to the Internet across telephone lines (dial-up) or by optical or wire cable (broadband or digital subscriber line, also known as DSL). The connection is provided by an Internet service provider (ISP), such as America Online, Comcast, or RoadRunner. For a monthly fee, these companies provide access to the Internet, e-mail, a certain amount of storage, and search utilities. These Internet providers will often offer portal sites that provide a Web browser, a chat service (Internet relay chat—IRC), instant messaging (IM), bulletin boards, newsgroups, and forums.
Each application requires a specific software program. Many computers are sold with these applications preloaded, such as Microsoft's Internet Explorer, the most popular Web browser. E-mail applications such as Eudora are purchased separately; many e-mail programs, however, are now Web-based. This means that users can access their Web-based e-mail program from any computer that is connected to the Internet. A specific software application is no longer required because the application runs from the server rather than from the computer itself.
All ISPs require a username and password, which establishes the user's identity and gives authorization to use the Internet service. The Internet service provider has its own higher-order identity on the Internet, known as a domain. For example, in the e-mail address jones@abc.com:
the first part of the address, "jones," identifies the user; this is the username. The "@" (pronounced "at") separates the username from the domain. In this example, "abc" is the domain name, and ".com" is the extension that identifies the entity as a commercial provider. Other extensions include .net for network, .edu for education, .mil for military, .gov for government, and .org for organization.
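The username/domain/extension breakdown described above amounts to simple string handling, shown here with the article's own "jones", "abc", and ".com" parts:

```python
address = "jones@abc.com"  # the article's illustrative address
username, domain = address.split("@")   # "jones" and "abc.com"
extension = domain.rsplit(".", 1)[1]    # "com": a commercial provider
print(username, domain, extension)
```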
Effect on Business and Industry
The World Wide Web has created a new industry segment called electronic commerce (e-commerce). Businesses sell to other businesses (B2B) and to consumers (B2C) on the Internet using secure Web sites. The "dot.com" frenzy came to a head in the late 1990s when the number of online companies exceeded demand. Although online commerce declined slightly, it has remained stable since then. Strong e-commerce providers are either "pure-play" (having only an Internet presence, such as eBay and Amazon.com) or "brick-and-click" (having both a physical store as well as an online store, such as Wal-Mart, Sears, and most other major retail outlets).
Internet technology has also had an impact on business and industry by supporting telecommuting. Rather than commuting to work, employees work from home via telecommunications (e.g., e-mail, video streaming, and online portals). Overhead costs are lowered if office space and equipment can be reduced, and the flexibility for the employee can be a benefit.
Additionally, Internet use has changed the face of education. Nearly every school in the United States has computer technology and Internet access. Students use Web browsers to search for information, teachers use online databases to access lesson plans and learning resources, and schools build Web sites that provide homework information, school calendars, and other important information for parents, faculty, and students.
Distance learning or online education has also made great strides. High schools, colleges, universities, and for-profit providers are supplementing their face-to-face classes with Web-based learning environments, such as Blackboard, WebCT, and e-College. Students can download activities, participate in synchronous chat groups or asynchronous discussion forums, work collaboratively with other students on group projects, take tests, and post their homework for evaluation. Some courses are offered totally online without any face-to-face interaction between the student and instructor.
Changes in Information Transfer and Communication
The Internet is one of the most innovative and productive technologies in history. The Internet can send information from virtually any place on the globe to any other place in seconds. This communication tool has dramatically changed the concept of the "speed of business." In effect, the Internet has created a sense of time compression. No longer do large documents need to be mailed by expensive overnight carriers. Electronic files are sent as e-mail attachments in seconds or documents can be posted to Web sites where they can be downloaded by thousands of recipients. Distribution has also been affected. Rather than mailing 1,000 newsletters to an organization's membership, Listservs enable the message to be sent to one address. The message is sent to the Listserv address (e.g., "email@example.com"), and anyone who has signed up or been added to the Listserv instantly receives the information.
A very popular new Web-based communication tool is the Weblog (or "blog"). Used by both companies and individuals, blogs are diaries posted to a host site that can be accessed by anyone. Some commercial blogs are designed for customer use. They offer free product advice, technical assistance, drivers and downloads, and product data to attract new customers. Microsoft's product developers use blogs to encourage interest in their work. In some cases, readers can post comments to forums, which the blogger monitors.
The ease of use and instantaneous communication of the Internet are generally seen as significant enhancements to society, but there are some negative aspects. The term CyberEthics refers to the ethical use of the Internet. For example, music or movie files are easily copied from compact disks or downloaded from file-sharing and peer-to-peer sites such as BearShare, e-donkey, Napster, and Kazaa. The Recording Industry Association of America (RIAA) attempts to combat piracy—the illegal duplication and distribution of any recording—via lawsuits and fines. The RIAA reported that worldwide, the industry was losing $4.2 billion to piracy each year.
The personal computer will continue to evolve, but experts predict that other Internet-smart appliances will become standard. Wristwatches will provide Internet access and support computer applications such as Word. Televisions will anticipate viewers' program preferences and record shows viewers may like. Kitchen appliances will be programmed by Internet-based command centers that will download recipes, inventory current ingredients (how much milk is left?), and print shopping lists. Like the explorers who discovered new continents, Internet users are just beginning to discover the full impact of the medium on information, space, and time.
see also Electronic Commerce; Electronic Mail; Intranet/Extranet
Recording Industry Association of America. http://www.riaa.org
Lisa E. Gueldenzoph
Mark J. Snyder
Gueldenzoph, Lisa, and Mark Snyder. "Internet." Encyclopedia of Business and Finance, 2nd ed. 2007. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-1552100183.html
JUDSON KNIGHT
The Internet is a vast worldwide conglomeration of linked computer networks. Its roots lie in the mid-twentieth century, with a number of projects by the United States government and the private sector, most notable of which was the computer network created by the Advanced Research Projects Agency (ARPA) of the Department of Defense (DOD) in 1969. Until the early 1990s, the Internet remained largely the province of specialists, including defense personnel and scientists. The creation of browsers, or software that provided a convenient graphical interface between user and machine, revolutionized the medium, and spawned rapid economic growth throughout the 1990s. Beyond the World Wide Web and e-mail, the parts most familiar to casual users, the Internet contains a frontier that offers both great promise and great challenges to law and security.
Birth of the Internet
The basis of the Internet is the network, a group of computers linked by communication lines. The distant ancestors of today's networks were highly specialized systems used either by DOD, or by private companies (for example, airlines, which tracked reservations on the SABRE system) during the late 1950s and early 1960s. The development of semiconductor technology in the 1960s enabled the growth of computer activity in general, and networking in particular. Universities and research centers participated in timesharing, whereby multiple users accessed the same system.
ARPANET, which connected time-sharing facilities at research centers, is generally regarded as the first true computer network. It provided a testing-ground for technologies that are still used today: simple mail transfer protocol (SMTP), the system that makes e-mail possible, and file transfer protocol (FTP), for transmitting large messages. To maximize effectiveness, ARPANET broke messages into small pieces, or packets, that could easily be transmitted and reassembled. The technique, known as packet switching, enhanced communication between computers.
The 1970s: TCP/IP. During the 1970s, ARPA (now known as the Defense Advanced Research Projects Agency, or DARPA) continued its efforts to connect its users, but it eventually ran into a dead-end posed by the primitive systems of networking used at the time. Faced with this roadblock, DARPA turned to two computer scientists, Vinton Cerf and Robert Kahn, who developed a design that revolutionized networks.
This was the transmission control protocol (TCP), which, coupled with the related Internet Protocol (IP), provided a mechanism for addressing messages and routing them to their destinations using an open architecture that connected standardized networks. In 1980, DOD adopted TCP/IP as its standard, and required all participants to adopt the protocol as of January 1, 1983. Some observers regard this event as the true birth of the Internet.
The 1980s: civilian agencies get involved. The 1980s saw use of computer networks expand to include civilian agencies. Among these was the National Science Foundation (NSF), which worked with five supercomputing centers spread across the country to create NSFNET, a "backbone" system intended to connect the entire nation. NSF succeeded in linking small local and regional networks to NSFNET. Other civilian participants in computer networks, which began to increasingly overlap with one another, included the Department of Energy and the National Aeronautics and Space Administration (NASA), as well as a number of private companies.
Also during this period, several independent consortiums took upon themselves the task of organizing and policing the rapidly growing Internet. Among these were the Internet Engineering Task Force and the Internet Society, both of which are concerned with Internet standards, as well as the Internet Corporation for Assigned Names and Numbers (ICANN). The latter controls policy with regard to the assignment of domain names, including top-level domains such as .com for commercial enterprises, .gov for government offices, .edu for schools, and so on.
The Internet Explosion
The mid-1980s saw the birth of the first commercial computer networks, including Prodigy, Compuserve, and Quantum Computer Services. The first two would eventually recede in significance as larger companies took over the Internet, but the third—founded in 1985 and renamed America Online (AOL) in 1989—would eventually merge with publishing and entertainment conglomerate Time Warner to control a wide span of media. All of that lay far in the future, however, during the mid-1980s, as the few commercial participants developed their first subscriber bases and linked up to NSFNET through the Commercial Internet Exchange (CIX).
A number of technological innovations in the 1980s and early 1990s portended the explosive growth of the Internet that would take place in the next decade. Among these was the development of the personal computer or PC, as well as local area networks (LANs), which linked computers within a single business or location. NSFNET, working with the Corporation for National Research Initiatives, sponsored the first commercial use of e-mail on the Internet. Then, in 1993, new legislation at the federal level permitted the full opening of the NSFNET to commercial users.
The result was much like the opening of lands in the western United States to homesteaders, only the "land" in this case existed in virtual or cyberspace, and instead of wagons, the new settlers used browsers. The first important browser was Mosaic, developed at the University of Illinois using standards created at the European Organization for Nuclear Research (CERN) by Tim Berners-Lee. Thus was born the World Wide Web, which uses hypertext transfer protocol, or HTTP. In this environment, Mosaic, its commercial successor Netscape Navigator (released after the formation of the Netscape Communications Corporation in 1994), and Microsoft's competing Internet Explorer would prove the most useful navigating tools.
Users of the Internet today can still travel to regions beyond the World Wide Web, where they can see what the Internet was like prior to 1993. The most significant surviving portion of this older section is Usenet, a worldwide bulletin board system containing some 14,000 forums or newsgroups. In addition to the Web and Usenet, the Internet includes e-mail (electronic mail), FTP sites (used for transferring pictures and other large files), instant messaging, and other components. At the edges of the Internet are proprietary services such as those accessible only to AOL users, as well as other pay sites. Additionally, company and government intranets (private networks accessible only through a password) lie beyond the periphery of the Internet, though a browser may be used to access both.
By 1988, the size of the Internet was doubling every year, and the advent of browsers made possible an enormous consumer influx. The mid- to late 1990s saw the formation of thousands of Internet service providers (ISPs), through which users gained access to the Internet in exchange for a monthly fee. As competition increased, fees decreased, forcing consolidation of providers. By the beginning of the twenty-first century, major companies such as AOL, AT&T, and Earthlink, along with a few second-tier ISPs, controlled most of the market.
The explosive growth of the Internet itself, coupled with the expanded opportunities for commerce it provided, fueled one of the greatest periods of economic growth in U.S. history, from 1996 to 2000. The economic downturn that began in April, 2000, and continued throughout the early 2000s, however, served as an indicator that the Internet—while it had certainly transformed communications—would not solve all problems.
There were several problems associated with the Internet itself, and simplest among these were the technological challenges involved in moving ever larger amounts of data. By the beginning of the twenty-first century, it became possible to access video and complex graphics using powerful data streams, and computer scientists envisioned technology that would make possible the use of high-resolution video or multiple streams on networks capable of processing 100 gigabits of data a second. To expand the number of available addresses, hitherto limited by the 32-bit IP address standard, the Internet Engineering Task Force in 1998 approved a new 128-bit standard. This made possible so many addresses that every electronic device in the world could have its own unique location in an ever-expanding Internet.
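The scale of the expansion from 32-bit to 128-bit addresses is easy to compute:

```python
ipv4_addresses = 2 ** 32    # the older 32-bit IP address space
ipv6_addresses = 2 ** 128   # the 128-bit standard approved in 1998

print(ipv4_addresses)            # 4294967296, about 4.3 billion
print(f"{ipv6_addresses:.2e}")   # about 3.40e+38
```

About 4.3 billion addresses in the old scheme, versus roughly 3.4 x 10^38 in the new one: enough, as the article notes, for every electronic device in the world to have its own unique location.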
Less simple were some of the challenges associated with human activities. There were cybercrimes, such as hacking or the dissemination of viruses, either of which could be used simply as a form of information-age vandalism, or for extortion. Hacking of financial service sites also offered the opportunity to commit robbery without picking locks, and for this reason many companies adopted secure, encrypted sites. (The latter were designated by the prefix https://, in contrast to the ordinary http://.)
Just as the Internet could be used for education, commerce, and a host of other purposes, it also provided a forum for activities that tested the limits of free speech; extremist political parties and hate groups could operate a Web site. On the other hand, use of the Web to distribute drugs, weapons, or child pornography carried stiff penalties. At the same time, government attempts to restrict or control aspects of the Internet raised concerns over the abrogation of First Amendment rights. The Internet itself was worldwide, beyond the reach of even the U.S. Constitution or any law, and although China's totalitarian regime attempted to restrict citizens' access to it, the network continued to work its way deeper and deeper into the fabric of modern life.
FURTHER READING:
Gillies, James, and R. Cailliau. How the Web Was Born: The Story of the World Wide Web. New York: Oxford University Press, 2000.
Hafner, Katie, and Matthew Lyon. Where Wizards Stay Up Late: The Origins of the Internet. New York: Simon & Schuster, 1996.
Young, Gray, ed. The Internet. New York: H. W. Wilson, 1998.
Defense Advanced Research Projects Agency. <http://www.darpa.mil/> (April 14, 2003).
Internet Society. <http://www.isoc.org/> (April 14, 2003).
Webopedia: Online Dictionary for Computer and Internet Terms. <http://www.webopedia.com/> (April 14, 2003).
see also Computer Software Security; DARPA (Defense Advanced Research Projects Agency); Internet: Dynamic and Static Addresses; Internet Spam and Fraud; Internet Tracking and Tracing; NSF (National Science Foundation).
Knight, Judson. "Internet." Encyclopedia of Espionage, Intelligence, and Security. 2004. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3403300398.html
For many people, a good deal of the day is spent online. The ability to send e-mail messages and "surf" the World Wide Web has already become matter-of-fact. But an amazing amount of technology and mathematics must occur for e-mail and Internet access to be successful.
A Brief History of the Internet
The general consensus is that the conception of the Internet occurred in the early 1960s within the Department of Defense's Advanced Research Projects Agency (ARPA), where the computing research program was conceived and headed by J. C. R. Licklider from the Massachusetts Institute of Technology. The intent was to share supercomputers among researchers in the United States.
Because computers in the 1960s were so large and expensive, it was important to find a way for many people, often at different locations, to be able to use the same computer. By the end of the decade, ARPANET was developed to solve this problem, and in 1969 four universities—Stanford, University of California–Los Angeles, University of California–Santa Barbara, and the University of Utah—were the first to be successfully connected.
The ARPANET was not available for commercial use until the late 1970s. By 1981 there were 213 different hosts (central computers) available on the ARPANET, although many were completely incompatible with one another because each "spoke" a different language. Things were somewhat disjointed until Bob Kahn and Vint Cerf created TCP/IP (Transmission Control Protocol/Internet Protocol), which became the common language for all Internet communication. This transformed the disparate collection known as ARPANET into one cohesive group, the Internet.
Even though the intent of the ARPANET and Internet was to allow researchers to share data and access remote computers, e-mail soon became the most popular application to communicate information. In the 30-plus
years since then, not much has changed. In an average week, approximately 110 million people are online in the United States. If, on average, each of those people sends ten e-mails per week (a conservative estimate), then there are more than a billion e-mails sent every week.
Traveling on the Internet
Although e-mail is something that is often taken for granted, a great deal must happen for an e-mail message to go from one device to another. Depending on its destination, an e-mail message's travel path can be either very short or very long.
Sending e-mail is similar in some ways to sending a letter through regular mail: there is a message, an address, and a system of carriers that determines the best way to deliver the mail. The biggest differences between sending e-mail and regular mail are the first and last steps.
When an e-mail message is sent, it is first broken down into tiny chunks of data called "IP packets." This is accomplished by a mailing program (such as Outlook Express or Eudora) using the TCP Internet language. These packets are each "wrapped" in an electronic envelope containing web addresses for both the sender and recipient.
Next, the packets are sent independently through the Internet. It is possible that every single packet (and there can easily be hundreds of them) is sent on a different path. They may go through many levels of networks, computers, and communications lines before they reach their final destination.
The packets' journey begins within the Internet Service Provider (ISP) or network (AOL or MSN, for example), where the address on the envelopes is examined. Addresses are broken into two parts: the recipient name and the domain name. For example, in an e-mail message sent to John_Doe@msn.com, "John_Doe" is the recipient name and "msn.com" is the domain name.
Based on the domain name, the router (a piece of equipment that determines the best path for the packets to take) will determine whether the packets remain on the network or need to be sent to a different router. If the former is the case, the packets are sent directly to the recipient's e-mail program and reassembled using TCP.
If the recipient is on a different network, things get more complex. The packets are sent through the Internet, where an Internet router determines both where they need to go and the best path to get there. Decisions like these are made by problem-solving programs called algorithms, which find the optimal path for sending the packets.
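The article does not name the specific algorithms routers use; as one illustration, the classic shortest-path method below (Dijkstra's algorithm) picks the cheapest route through a small made-up network, where the letters are routers and the edge weights stand in for congestion or delay.

```python
import heapq

# A toy network: nodes are routers, weights are link costs (invented).
links = {
    "A": {"B": 4, "C": 1},
    "B": {"D": 1},
    "C": {"B": 2, "D": 5},
    "D": {},
}

def best_path(graph, start, goal):
    """Dijkstra's algorithm: lowest-cost path from start to goal."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, weight in graph[node].items():
            if nxt not in visited:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return None

print(best_path(links, "A", "D"))  # (4, ['A', 'C', 'B', 'D'])
```

Note that the cheapest route from A to D is not the direct-looking hop through B (cost 4 + 1) but the detour through C (1 + 2 + 1), just as a packet's path may not be geographically obvious.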
Each packet is sent from one network to another until it reaches its final destination. Because they determine where the packets should go, routers can be likened to different transportation stations within a huge transportation system containing buses, trains, and airplanes. To get from one part of the world to another, a message may have to go through several stations and use multiple types of transportation.
For example, assume that two travelers are both starting in New York City and heading for Los Angeles. They get separated and end up taking different modes of transport yet still end up at the same point. This is what happens to the packets when they make the trip from the originating computer to their eventual destination; that is, they can get separated and sent on different paths to their final destination. Routers determine the optimal path for each packet, depending on data traffic and other factors.
The packets often arrive at the final destination at different times and in the wrong order. The recipient will not see an e-mail message until all of the packets arrive. They are then recombined in the correct order by the recipient's mail program, using TCP, into a message that the recipient can read.
How quickly all of this occurs can be influenced by many factors, some within the control of the e-mail user and others beyond it. One factor that can be controlled is the way information is received and sent to and from the originating computer. Popular types of connections available in 2001 are telephone modems, DSL (Digital Subscriber Line), cable, T1, and T3.
Telephone modems are the earliest and slowest of the possible types of connections. In relation to the transportation metaphor used previously, they would be the buses. Under optimal conditions, one can download or upload information at rates of between 14 and 56 kbps (kilobits per second) with a modem. (One kilobit equals one thousand bits.) A bit is what makes up the data that are sent.*
*Eight bits equals one byte, and one byte equals a single character (a letter or numeral).
Actual transmission speeds for modems tend to be much slower than the optimal speeds because there is a vast, constant stream of data being transferred back and forth. Compare this to driving on a highway. Even though the speed limit may be 65 miles per hour (mph), because of traffic and road conditions, one may need to drive less than 65 mph. On the Internet, it is almost always rush hour.
Under perfect conditions, a 56 kbps modem can download about 7,000 characters of data per second (8 bits per character), or over 400,000 characters per minute. That may sound like a lot of information, but it really is not. Most text messages (such as e-mail messages) are relatively small and will download quickly using a modem. Audio, video, or other multimedia files, however, cause more of a problem. These files can easily be upwards of 5 or 10 million bytes each, and thus use a much greater bandwidth.
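The bits-to-characters arithmetic works out as follows, using the footnote's conversion of 8 bits to one character; the 5-million-byte file is the article's own multimedia example.

```python
modem_bps = 56_000                 # a 56 kbps modem, in bits per second
chars_per_second = modem_bps // 8  # 8 bits = 1 byte = 1 character
chars_per_minute = chars_per_second * 60
print(chars_per_second, chars_per_minute)  # 7000 420000

# Time to pull down a 5-million-byte multimedia file at that ideal rate:
file_bytes = 5_000_000
minutes = file_bytes / chars_per_second / 60
print(round(minutes, 1))  # about 11.9 minutes
```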
Faster alternatives to modems are now widely available. The most common alternatives for home use are DSL and cable modems. DSL works through the phone line. Speeds for DSL tend to be in the range of 1.5 mbps (megabits per second). One megabit is equal to 1,000 kilobits.
Cable modems, unlike DSL, have nothing to do with phone lines. Cable modems transmit data using the cable that carries cable television signals. They offer fast speeds of up to 6 mbps. Even though this is a very good speed, an ISP may limit the available bandwidth, which restricts the size of files that can be uploaded or downloaded.
For large companies, universities, and the Internet Service Providers, speeds need to be high and bandwidths need to be enormous. T1 and T3 lines, which are dedicated digital communication links provided by the telephone company, are used for this purpose. They typically carry traffic to and from private business networks and ISPs, and are not used in homes.
Both T3 and T1 lines have their advantages in certain areas. With T3 connections one can potentially access speeds of nearly 45 mbps, or somewhere around one thousand times that of a modem. Transmission speeds for T1 lines are considerably slower, running at 1.5 mbps. The advantage of T1 is privacy. T1 connection lines are not shared with other users. In contrast, T3 connection lines (as well as modems, cable, and DSL) are shared.
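Using the nominal speeds above, the relative throughput of each connection type can be compared directly; these are ideal-case ratios, before any of the sharing and congestion effects the article describes.

```python
speeds_kbps = {"modem": 56, "T1": 1_500, "cable": 6_000, "T3": 45_000}
for name, kbps in speeds_kbps.items():
    print(f"{name}: {kbps / speeds_kbps['modem']:.0f}x a 56 kbps modem")
```

A T3 line works out to roughly 800 times a modem's speed, consistent with the article's "somewhere around one thousand times" estimate.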
Consider the highway metaphor once again. Having a T1 line is like maintaining a private two-lane highway on which only certain people are allowed to drive. Having a T3 line is more like driving on a 4-lane autobahn (the highway system in Germany, where there is no speed limit), with three of the lanes clogged up with slow-moving trucks. On the autobahn the potential exists to go very fast, but the traffic often prevents drivers from reaching high speeds. So whether the T1 or T3 is more desirable depends on which is more valued—speed or privacy.
When an e-mail message is sent, there is a very good possibility that the packets will encounter nearly all of these types of connections on their journeys—just like people can use planes, trains, and automobiles. The next time you hit the "send" button, think about all of the logical and mathematical operations that are about to happen.
see also Computers and the binary system; Internet data, reliability of; Numbers, massive.
Philip M. Goldfeder
Abbate, Janet. Inventing the Internet. Cambridge, MA: MIT Press, 1999.
Gralla, Preston. How the Internet Works: Millennium Edition. Indianapolis: QUE, 1999.
Lubka, Willie, and Nancy Holden. KISS Guide to the Internet. New York: Dorling Kindersley, 2000.
Average Weekly Web Usage: United States. <http://www.nielsen-netratings.com>.
Brain, Marshall. How E-mail Works. <http://www.howstuffworks.com/email1.htm>.
Finnie, Scot. 20 Questions: How the Net Works. <http://coverage.cnet.com/Content/Features/Techno/Networks/index.html>.
Frequently Asked Questions About T1. <http://www.everythingt1.com/faq.html>.
Timeline: PBS Life on the Internet. <http://www.pbs.org/internet/timeline/index.html>.
Goldfeder, Philip M. "Internet." Mathematics. 2002. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3407500154.html
INTERNET. Arguably the most important communications tool ever created, the Internet connects millions of people to online resources each day. Grown from seeds planted during the Cold War, the roots of the Internet were formed to develop a reliable, national system for communications. Although early pioneers disagree over whether the computer-based communications network was built to withstand nuclear attack, the uneasy tension between the United States and the Soviet Union during the Cold War certainly increased the resolve of the United States to fund and develop relevant scientific and defense-related projects aimed at national security.
Home to many of the preeminent scientists of the time, the Massachusetts Institute of Technology (MIT) served as the birthplace of the Internet. It was there, in Cambridge, Massachusetts, that President Harry Truman's administration formed MIT's Lincoln Laboratories to begin work on the Semi-Automatic Ground Environment (SAGE). SAGE's primary goal was to develop an air defense system that involved a network of interconnected computers across the United States. The push for advanced technology received an even larger boost in August 1957, when the Soviet Union test fired its first intercontinental ballistic missile and subsequently launched its Sputnik orbiter in October of that same year. Shortly thereafter, President Dwight D. Eisenhower convened a meeting of his Presidential Science Advisory Committee. From that meeting and subsequent congressional testimony on the progress of U.S. defense and missile programs, it became clear that the "science gap" between the two superpowers had widened. Eisenhower sought funding for the Advanced Research Projects Agency (ARPA) late in 1957 and obtained it the following year.
In the early 1960s, the Lincoln Laboratory researchers Lawrence Roberts and Leonard Kleinrock worked on developing a method of digitizing and transmitting information between two computers using a communications method called packet switching. Similar work on systems that used store-and-forward switching was also underway in the late 1950s under the direction of Paul Baran at the RAND Corporation and Donald Davies at the National Physical Laboratory in England. At the heart of both research projects was the development of a communications system in which information would be distributed among all nodes on a network, so that if one or more nodes failed, the entire network would not be disabled. This type of network, in which messages were passed from node to node, with no single node responsible for the end-to-end traffic, was called hot-potato routing.
ARPA's first director, J. C. R. Licklider, moved from Lincoln Laboratory to a small Cambridge, Massachusetts–based consulting firm, Bolt, Beranek, and Newman (BBN), where researchers continued to explore the use of computers as tools of communication. While there, Licklider and his colleagues developed the necessary hardware to connect computers to telephone lines and also researched the collection of data from a wide array of other sources including antennae, submarines, and other real-time sensors. Most of BBN's projects were ARPA supported and sought to achieve ARPA's ultimate goal of helping close the science gap by creating a nationwide network of interconnected computers.
In the summer of 1968, ARPA issued a request for proposals to more than 130 different research centers with the goal of creating a digital network of computers conforming to ARPA's technical specifications. Roberts developed the criteria and served as the chief architect of the network's overall design, which included the deployment of "packet switching technology, using half-second response time, with measurement capability, and continuous operation"—that is, an Internet. Frank Heart and the team of scientists at BBN were awarded the contract in December 1968. Outfitted with specialized minicomputers and interface hardware, BBN set out to connect its "packet switches," or Interface Message Processors (IMPs), at each ARPA-determined remote location (node), which would then communicate with the host computer at that location. Robert Kahn and Vinton Cerf, with Jon Postel and Charles Kline, developed the software to connect host computers to the IMPs, a host-to-host protocol governing how packets would be routed. While America was absorbed in NASA's race to land on the moon in the summer of 1969, BBN air shipped its first IMP computer across the country—no small feat for the time. It arrived safely and was working at the first node, the University of California at Los Angeles, in August 1969.
This phase of the ARPA-BBN project was completed in nine months. Meanwhile, work continued on connecting the second node, the Stanford Research Institute (SRI) in Menlo Park—some four hundred miles away—to its interface message processor. The SRI node came online on 1 October 1969, and on 29 October the first host-to-host message, "LO," was passed between UCLA and SRI. BBN continued to progress, installing nodes three and four at the University of California at Santa Barbara (1 November 1969) and the University of Utah (1 December 1969). Only in March of the following year did BBN connect its Cambridge offices to the newly created ARPAnet.
The ARPAnet continued to evolve through the early 1970s with the addition of more diverse data networks such as the University of Hawaii's ALOHAnet packet radio network and the European-based packet satellite network. During this period, the first terminal interface processor (TIP) was introduced to the network, thereby allowing computer terminals to call directly into the ARPAnet using standard telephone lines. In 1972, the first electronic messaging program (e-mail) that supported incoming and outgoing messages was developed. In that same year, a file transfer protocol specification (FTP) to allow for the transmission of data files across the network was designed and tested. With these additions, ARPAnet truly began to fulfill its mission as an open-architecture network, accommodating a variety of different environments and allowing the free sharing of resources.
As the uses of the network grew, more efficient methods for carrying data were needed, forcing an evolution of transmission protocols—the underlying control layer in which the messages flowed—and addressing schemes. After many refinements, TCP/IP (transmission control protocol/Internet protocol) became the de facto standard for communicating on the network. A naming scheme also became necessary and the Domain Name System (DNS) was developed by Paul Mockapetris of the University of Southern California. DNS allowed for the assignment of names to networks and nodes, supplanting the use of numeric addresses. In 1973, Ethernet technology was developed, allowing for the rapid addition of nodes and workstations to the network. With the birth of the personal computer and local area networks (LANs) in the early 1980s, the network grew at a staggering pace.
The federal government funded the network and its infrastructure through 1995. The work of the National Science Foundation (NSF) was instrumental in shaping the future evolution of the Internet as a true "information superhighway." However, federal funding of the Internet was terminated as a result of the NSF's privatization initiative to encourage commercial network traffic. Control of the large backbones of the network—the set of paths with which local or regional networks connected for long-haul connectivity—was redistributed to private regional network service providers.
The Internet serves as a vital network of communication in the form of e-mail, news groups, and chat. It also provides unparalleled resource sharing and resource discovery through the World Wide Web. At the end of 2001, the Internet continued its phenomenal annual rate of growth of 100 percent. At its start in 1981, the Internet connected just over two hundred researchers and scientists. By the end of 2002, it was estimated that the Internet had the capacity to reach more than six billion people worldwide.
Abbate, Janet. Inventing the Internet. Cambridge, Mass.: MIT Press, 1999.
Hauben, Michael, and Ronda Hauben. Netizens: On the History and Impact of Usenet and the Internet. Los Alamitos, Calif.: IEEE Computer Society Press, 1997.
Quarterman, John S., and Smoot Carl-Mitchell. The Internet Connection: System Connectivity and Configuration. Reading, Mass.: Addison-Wesley, 1994.
Segaller, Stephen. Nerds 2.0.1: A Brief History of the Internet. New York: TV Books, 1998.
"Internet." Dictionary of American History. 2003. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3401802121.html
"Internet." Dictionary of American History. 2003. Retrieved June 25, 2016 from Encyclopedia.com: http://www.encyclopedia.com/doc/1G2-3401802121.html
the Internet, international computer network linking together thousands of individual networks at military and government agencies, educational institutions, nonprofit organizations, industrial and financial corporations of all sizes, and commercial enterprises (called gateways or service providers) that enable individuals to access the network. The most popular features of the Internet include electronic mail (e-mail), blogs (web logs or journals), discussion groups (such as newsgroups, bulletin boards, or forums where users can post messages and look for responses), on-line conversations (such as chats or instant messaging), wikis (websites that anyone on the Internet can edit), adventure and role-playing games, information retrieval, electronic commerce (e-commerce), Internet-based telephone service (voice over IP [VoIP]), and web mashups (in which third parties combine their web-based data and services with those of other companies).
The public information stored in the multitude of computer networks connected to the Internet forms a huge electronic library, but the enormous quantity of data and number of linked computer networks also make it difficult to find where the desired information resides and then to retrieve it. A number of progressively easier-to-use interfaces and tools have been developed to facilitate searching. Among these are search engines, such as Archie, Gopher, and WAIS (Wide Area Information Server), and a number of commercial, Web-based indexes, such as Google or Yahoo, which are programs that use a proprietary algorithm or other means to search a large collection of documents for keywords and return a list of documents containing one or more of the keywords. Telnet is a program that allows users of one computer to connect with another, distant computer in a different network. The File Transfer Protocol (FTP) is used to transfer information between computers in different networks. The greatest impetus to the popularization of the Internet came with the introduction of the World Wide Web (WWW), a hypertext system that makes browsing the Internet both fast and intuitive. Most e-commerce occurs over the Web, and most of the information on the Internet now is formatted for the Web, which has led Web-based indexes to eclipse the other Internet-wide search engines.
Each computer that is directly connected to the Internet is uniquely identified by a binary number called its IP address. Most computers presently use an Internet Protocol version 4 (IPv4) address, which is 32 bits in size. This address is usually seen as a four-part decimal number, such as 18.104.22.168, with each part equating to 8 bits (1 byte) of the 32-bit address in the decimal range 0–255; the parts are separated by dots (periods). Although the number of addresses available under IPv4 is roughly 4.3 billion, the number of unassigned addresses will soon be depleted.
The Internet is transitioning to IP version 6 (IPv6) addressing, which uses 128 bits to represent an address. An IPv6 address is usually represented as an eight-part hexadecimal number (see numeration); each part is equivalent to 16 bits (2 bytes) of the 128-bit address in the hexadecimal range 0000–ffff, and colons are used to separate the parts. An IPv6 address such as 1234:0000:0000:0000:1234:5678:9abc:deff may also be represented by a shorthand version, 1234::1234:5678:9abc:deff, in which the longest run of consecutive zero-valued groups is omitted. IPv6 allows for some 3.4 × 10^38 addresses.
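Both address formats can be inspected with Python's standard ipaddress module. The sketch below is purely illustrative and reuses the example addresses from the text.

```python
import ipaddress

# IPv4: a 32-bit number, written as four decimal bytes separated by dots.
v4 = ipaddress.IPv4Address("18.104.22.168")
print(int(v4))                # the single 32-bit integer behind the dots

# IPv6: 128 bits, written as eight 16-bit hexadecimal groups separated
# by colons; the longest run of zero groups can be compressed to "::".
v6 = ipaddress.IPv6Address("1234:0000:0000:0000:1234:5678:9abc:deff")
print(v6.compressed)          # shorthand form
print(v6.exploded)            # full eight-group form
```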
Because an address of the form 22.214.171.124 is usually difficult to remember, a system of Internet addresses, or domain names, was developed in the 1980s. An Internet address is translated into an IP address by a domain-name server, a program running on an Internet-connected computer. Reading from left to right, the parts of a domain name go from specific to general. For example, www.college.columbia.edu is a World Wide Web site for Columbia College, which is part of Columbia Univ., which is an educational institution. The rightmost part, or top-level domain (or suffix or zone), can be a two-letter abbreviation of the country in which the computer is in operation; more than 250 abbreviations, such as "ca" for Canada and "uk" for United Kingdom, have been assigned. Although such an abbreviation exists for the United States (us), it is more common for a site in the United States to use a generic top-level domain such as edu (educational institution), gov (government), or mil (military) or one of the four domains originally designated for open registration worldwide, com (commercial), int (international), net (network), or org (organization). In 2000 seven additional top-level domains (aero, biz, coop, info, museum, name, and pro) were approved for worldwide use, and other domains, including the regional domains asia and eu, have since been added. In 2008 new rules were adopted that would allow a top-level domain to be any group of letters, but the final approval for proceeding with the creation of such domain names (beginning in 2012) waited until 2011. In 2009 further rules changes permitted the use of other writing systems in addition to the Latin alphabet in domain names (beginning in 2010).
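The left-to-right, specific-to-general structure described above is easy to see programmatically; this small Python sketch (illustrative only) splits a domain name into labels and reads them from most general to most specific.

```python
def domain_parts(name: str):
    """Split a domain name into labels, most general (the TLD) first."""
    return list(reversed(name.lower().split(".")))

parts = domain_parts("www.college.columbia.edu")
print(parts)                      # ['edu', 'columbia', 'college', 'www']
print("top-level domain:", parts[0])
```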
The Internet evolved from a secret feasibility study conceived by the U.S. Dept. of Defense in 1969 to test methods of enabling computer networks to survive military attacks, by means of the dynamic rerouting of messages. As the ARPAnet (Advanced Research Projects Agency network), it began by connecting three nodes in California with one in Utah; the connected computers communicated using an agreed set of rules (initially the Network Control Protocol, later superseded by the Internet Protocol, or IP). By 1972, when the ARPAnet was revealed to the public, it had grown to include about 50 universities and research organizations with defense contracts, and a year later the first international connections were established with networks in England and Norway.
A decade later, the Internet Protocol was enhanced with a set of communication protocols, the Transmission Control Protocol/Internet Protocol (TCP/IP), that supported both local and wide-area networks. Shortly thereafter, the National Science Foundation (NSF) created the NSFnet to link five supercomputer centers, and this, coupled with TCP/IP, soon supplanted the ARPAnet as the backbone of the Internet. In 1995 the NSF decommissioned the NSFnet, and responsibility for the Internet was assumed by the private sector. Progress toward the privatization of the Internet continued when the Internet Corporation for Assigned Names and Numbers (ICANN), a nonprofit U.S. corporation, assumed oversight responsibility for the domain name system in 1998 under an agreement with the U.S. Dept. of Commerce.
Fueled by the increasing popularity of personal computers, e-mail, and the World Wide Web (which was introduced in 1991 and saw explosive growth beginning in 1993), the Internet became a significant factor in the stock market and commerce during the second half of the decade. By 2000 it was estimated that the number of adults using the Internet exceeded 100 million in the United States alone; in 2010 it was estimated that there were 2 billion Internet users worldwide. The increasing globalization of the Internet has led a number of nations to call for oversight and governance of the Internet to pass from the U.S. government and ICANN to an international body, and revelations of U.S. Internet spying beginning in 2013 gave new impetus to such calls. A 2005 international technology summit agreed to preserve the status quo while establishing an international forum for the discussion of Internet policy issues. In 2014 the U.S. Commerce Dept. announced its intention to hand over control by Sept., 2015, to a body consisting of business, government, and other representatives.
See S. Coleman and J. G. Blumler, The Internet and Democratic Citizenship (2009); J. Ryan, A History of the Internet and the Digital Future (2010); J. Brockman, ed., Is the Internet Changing the Way You Think? (2011); S. Levmore and M. C. Nussbaum, ed., The Offensive Internet (2011); E. Pariser, The Filter Bubble: What the Internet Is Hiding from You (2011); J. Lanier, You Are Not a Gadget: A Manifesto (2011).
"Internet, the." The Columbia Encyclopedia, 6th ed.. 2016. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1E1-Internet.html
"Internet, the." The Columbia Encyclopedia, 6th ed.. 2016. Retrieved June 25, 2016 from Encyclopedia.com: http://www.encyclopedia.com/doc/1E1-Internet.html
The Internet is a computer network that was designed to interconnect other computer networks. Its origins lie in the ARPANET, an experimental network designed for the U.S. Department of Defense Advanced Research Projects Agency (ARPA) in 1969. The original ARPANET had some features that were unique in its day.
The first unique feature was that it supported peer-to-peer networking. In this system, each computer has the same rights and abilities as any other computer on the network. The commercial computer networks of that time were hierarchical: some devices performed special control functions, and other devices had to wait for permission from the controller to transmit.
Another unique feature of ARPANET was that it was not designed with a particular application or set of applications in mind. The designers created a network whose uses were not fully specified. As a result, ARPANET was designed to be transparent to applications. This allowed new Internet applications to be developed by placing the necessary functions (usually computer software) in end user devices rather than in the network. Thus, new applications did not require changes to the network.
Yet another unique feature of ARPANET was that it allowed organizations to have operational control of their local networks while still allowing them to be interconnected. This made it possible for a computer at a Burger King restaurant to communicate with a computer at a McDonald's restaurant without forcing the management at either restaurant to give up local autonomy for the privilege of communicating with each other.
In the 1980s, ARPANET split into a military component and a civilian section. The civilian part became known as NSFnet, in acknowledgement of support from the National Science Foundation. Other developments in this decade included the development of local area networks (LANs), which pushed peer-to-peer networking closer to many end users, and the microcomputer, or personal computer, which made it possible for many people to have dedicated computer access. NSFnet was limited by its charter to educational and not-for-profit organizations. Although commercial firms began to see the advantages of NSFnet, they were not able to participate fully in this new age of communications until NSFnet was privatized in 1993.
The Internet has grown in leaps and bounds since privatization, fueled by the emergence of a new application, the World Wide Web, and the resources of the private sector.
The Internet has become a change agent in many areas of the economy. Examples of this include retail sales, business to business transactions, telephone and video carriage, and music distribution. In fact, few industries have not been touched in a significant way by the Internet. Many industries have reorganized themselves as a direct result of the economic changes brought about by Internet-based applications.
For the most part, computers on the Internet communicate via two communications protocols: the Transmission Control Protocol (TCP) and the Internet Protocol (IP). The role of finding a path through a complex network is left to IP. This is a "best effort" protocol, in that it does the best it can to deliver a packet to the desired destination, but makes no promises. Thus, if a portion of the network failed, IP would attempt to reroute around the failure if it could, but would not guarantee that all packets would survive intact. Many applications require stronger assurances than this, and that is the role of TCP. The TCP is a communications protocol that operates between two end devices, ensuring that the complete information that was transmitted arrives safely at the destination. If some of the information is lost by IP, TCP retransmits it until it is received correctly. Thus, the two protocols operate in tandem to provide a complete, reliable service to end users.
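The division of labor between the two protocols can be sketched in a few lines. The simulation below is not real TCP; it simply models IP as a channel that may silently drop packets and TCP as a sender that retransmits until every packet gets through. All names are invented for the illustration.

```python
import random

def lossy_send(packet, drop_rate, rng):
    """Best-effort delivery, like IP: the packet may be silently dropped."""
    return None if rng.random() < drop_rate else packet

def reliable_send(packets, drop_rate, seed=7):
    """TCP-like behavior: keep retransmitting each packet until it arrives."""
    rng = random.Random(seed)     # fixed seed so the run is repeatable
    received, retransmissions = [], 0
    for pkt in packets:
        while True:
            delivered = lossy_send(pkt, drop_rate, rng)
            if delivered is not None:
                received.append(delivered)
                break
            retransmissions += 1  # lost in transit: send it again
    return received, retransmissions

message = ["pkt0", "pkt1", "pkt2", "pkt3"]
got, resent = reliable_send(message, drop_rate=0.3)
print(got == message, "packets retransmitted:", resent)
```

However lossy the underlying channel, the retransmission loop guarantees the complete message arrives, which is exactly the assurance TCP layers on top of IP's best effort.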
The Internet differs from telephone networks in that information is broken into packets, each of which is treated separately, much like a letter. The Internet allocates its resources to individual packets as needed. By contrast, the telephone network treats a telephone call as a stream of information, and allocates resources to that call (or stream of information) regardless of whether the users are speaking or are silent. In a packet network, resources are allocated only when there is information to transmit. This packet switching feature is commonly found in computer networks.
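Packetization itself is straightforward to illustrate: the message is cut into numbered chunks, and the sequence numbers let the receiver rebuild it even when chunks arrive out of order. A minimal, purely illustrative sketch:

```python
def packetize(message: str, size: int):
    """Break a message into (sequence_number, chunk) packets."""
    count = (len(message) + size - 1) // size          # ceiling division
    return [(i, message[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(packets):
    """Restore the message even if the packets arrived out of order."""
    return "".join(chunk for _, chunk in sorted(packets))

pkts = packetize("Each packet is treated separately, much like a letter.", 10)
pkts.reverse()                    # simulate out-of-order arrival
print(reassemble(pkts))
```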
Physically, the Internet consists of special purpose computers called routers that are interconnected with each other. Routers are equivalent to switches in the telephone network, in that they decide what to do with a packet when it arrives from a neighboring router. This decision is aided by a routing table, which is used by the router to determine where the packet should be sent next. The routing tables are constructed by the routers themselves, which communicate with each other so that efficient paths through the network can be found for packets traveling between any pair of destinations, and so that congested or failed routers can be avoided.
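A routing table can be pictured as a map from destination prefixes to next hops, consulted with a longest-prefix match. The toy table below uses string prefixes in place of real binary IP prefixes, and all router names are invented.

```python
# Toy routing table: destination prefix -> next hop.
ROUTES = {
    "10.1.":    "router-A",
    "10.1.2.":  "router-B",        # more specific route to one subnet
    "192.168.": "router-C",
    "":         "default-gateway", # matches anything not covered above
}

def next_hop(dest_ip: str) -> str:
    """Pick the route whose prefix matches the most of the destination."""
    best = max((p for p in ROUTES if dest_ip.startswith(p)), key=len)
    return ROUTES[best]

print(next_hop("10.1.2.7"))       # most specific match: router-B
print(next_hop("10.1.9.1"))       # router-A
print(next_hop("8.8.8.8"))        # falls through to the default gateway
```

In a real router these tables are built and updated automatically by routing protocols, which is how congested or failed neighbors come to be avoided.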
Today, many users access the Internet through Internet Service Providers (ISPs). For a monthly fee, an ISP provides users with a way of accessing the Internet (usually via a dialup modem), an electronic mail address and mailbox, and, often, a page that can be viewed by World Wide Web browsers. These retail ISPs often interconnect with large, high capacity backbone ISPs, which provide the transport functions so that a packet from one user can reach any other user.
The Internet is a constantly changing resource. It has had a deep impact on industries and on the lives of many Americans. The collection of computer networks known as the Internet will probably continue to affect society in ways that we are still trying to understand.
see also E-commerce; Government Funding, Research; Internet: Applications; Internet: Backbone; Internet: History; Intranet; Networks; Routing; Telecommunications; World Wide Web.
Martin B. Weiss
Dodge, Martin, and Rob Kitchin. The Atlas of Cyberspace. New York: Addison-Wesley, 2001.
Sutherland, Keith. Understanding the Internet: A Clear Guide to Internet Technologies. Boston: Butterworth-Heinemann, 2000.
Weiss, Martin B. "Internet." Computer Sciences. 2002. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3401200082.html
The Internet is a product of the Cold War. It was originally developed by the Government of the United States during the 1970s as a means of sharing information and protecting communications in the event of a nuclear attack. During the 1980s it developed quickly, first into an academic exchange network, then as a means of mass electronic communication available in principle to anyone having access to a personal computer and a telephone line. In 1995 approximately 1 million people were using the Web. Two years later this total had reached an estimated 40 million. Between 1993 and 1997 the number of accessible pages on the Web grew from around 130,000 to more than 30 million. Most sites offer free access. (Pornographic ‘clubs’ are a rare exception since membership usually carries a monthly or annual charge.)
Most users find information by using one of many ‘search engines’ that are available. These are fast computers which produce organized lists of relevant Web sites in response to a query about particular topics or key-words. For example, typing in the name of a multinational corporation (such as ‘Nissan’) will generate dozens of sites giving information about the company's current products, economic performance, manufacturing capacity, retailing outlets, and so forth. Many of these will be ‘official’, being maintained by the company or its agents, but some will be unofficial sites supported by Nissan enthusiasts.
Use of the Internet continues to grow rapidly all over the world. The social implications of this are contested. It has been argued that the Internet is the greatest technological development of the twentieth century, comparable in importance to (say) the invention of printing, or even of electricity. It could change the way economies function, for example by depressing prices (as customers increasingly have the facility to search the globe for the cheapest products), holding down wages (some tasks can be farmed out electronically to cheap labour-markets), or making it possible for people to work from home. There are companies which subcontract routine administrative work (such as maintaining their personnel records) via the Internet to Third World agencies which can pay computer staff lower wages than would be required in the West. Increasingly, it is possible to shop on-line (for example to buy airline tickets direct from airlines), and this may affect the structure of retailing. Some forecasts suggest that the resulting so-called technological deflation may depress prices by as much as 25 or 30 per cent over the next decade.
There are also more than 3,500 sites where one can search for a job. This is said to be affecting the US labour-market, since people on the East Coast can explore vacancies in the West that they otherwise would not be aware of, and vice versa. The increasing availability of digital products (including on-line magazines and films) may lead to an under-recording of economic activity by conventional measures (such as Gross National Product) and may make it difficult for governments to collect certain taxes. Some observers maintain that the existence of the Web makes totalitarian regimes less likely to succeed, because the effects of propaganda can readily be countered by accessing alternative sources of information on the Web, and it has even been suggested that this will make new forms of participatory democracy possible in the near future.
Sceptics argue that much of the information available on the Internet is trivial. They also point out that global ‘Netizenship’ is restricted to those who can afford a personal computer, a modem to link it to the world's telephone lines, and who can then pay the associated running costs. It is estimated that in Britain, for example, fewer than 2 million people have PCs (as compared to 22 million households which have television sets). More than 96 per cent of Internet sites are located in the most affluent 27 nations. Fluency in English is virtually a prerequisite of Net use. This information revolution could therefore be creating a new international division of the world into a small group of ‘information-rich’ countries and individuals and a dispossessed majority who will be excluded from this particular form of power. The computers which serve the system also seem to be permanently on the edge of collapsing under the weight of demand. New capacity constantly has to be installed. Users often complain of information overload.
At the time of writing, there are signs that some of these shortcomings may be overcome by the mass production of cheap ‘network computers’ (which do not contain expensive components such as hard drives), and by making television the key medium through which the Web is accessed. This may make the Internet a truly universal and affordable source of information. If the problem of finding a secure method for payment of goods bought over the Web is also solved then the prospects for transforming retailing and other markets will also be dramatically enhanced. On the history of the Internet, and its possible implications for the organization of work, leisure, and politics, see Rob Shields (ed.), Cultures of Internet (1996). See also CYBERSOCIETY; TELECOMMUTING.
Marshall, Gordon. "Internet." A Dictionary of Sociology. 1998. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1O88-Internet.html
Biologists often use two terms to describe alternative approaches for conducting experiments. "In vitro" (Latin for "in glass") refers to experiments typically carried out in test tubes with purified biochemicals. "In vivo" ("in life") experiments are performed directly on living organisms. In recent years, the indispensable use of computers and the Internet for genetic and molecular biology research has introduced a new term into the language: "in silico" ("in silicon"), referring to the silicon used to manufacture computer chips. In silico genetics experiments are those that are performed with a computer, often involving analysis of DNA or protein sequences over the Internet.
Geneticists and molecular biologists use the Internet much the same way most people do, communicating data and results through e-mail and discussion groups and sharing information on Web sites, for instance. They also make wide use of powerful Internet-based databases and analytical tools. Researchers are determining the DNA sequences of entire genomes at an ever accelerating pace, and are devising methods for cataloging entire sets of proteins (termed "proteomes") expressed in organisms. The databases to store all this information are growing at an equal pace, and the computer tools to sort through all the data are becoming increasingly sophisticated.
One of the most important Web sites for biological computer analysis (sometimes called bioinformatics ) is that of the National Center for Biotechnology Information (NCBI), a part of the National Library of Medicine, which, in turn, is part of the National Institutes of Health. The NCBI Web site hosts DNA and protein sequence databases, protein three-dimensional structure databases, scientific literature databases, and search engines for retrieving files of interest. All of these resources are freely accessible to anyone on the Internet.
Of all the powerful analytical tools available at NCBI, probably the most important and heavily used is a set of computer programs called BLAST, for Basic Local Alignment Search Tool. BLAST can rapidly search many sequence databases to determine whether any DNA or protein sequence (a "query sequence," supplied by the user) is similar to other sequences. Since sequence similarity usually suggests that two proteins or DNA molecules are homologous (i.e., that they are evolutionarily related and therefore may have, or encode proteins with, similar functions), discovering a BLAST match between an unknown protein or nucleic acid sequence and a well-characterized sequence provides an immediate clue about the function of the unknown sequence. An important scientific discovery that might once have required many years of in vitro and in vivo analysis can now be made in seconds with this simple in silico experiment.
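The core idea behind BLAST, scoring local alignments between a query and database sequences, can be sketched in a toy form. The following is only a conceptual illustration (the sequence names and scoring values are invented for the example); real BLAST uses seeded word matches, indexed databases, and statistical significance estimates rather than this exhaustive comparison.

```python
# Toy illustration of local-alignment search: for each database sequence,
# find the best-scoring ungapped alignment with the query (+1 per match,
# -1 per mismatch, with the running score reset at zero so the alignment
# stays "local"). Not the actual BLAST algorithm.

def best_local_score(query, subject):
    best = 0
    # Try every pairing of start positions (every diagonal).
    for q_start in range(len(query)):
        for s_start in range(len(subject)):
            running = 0
            for q, s in zip(query[q_start:], subject[s_start:]):
                running += 1 if q == s else -1
                if running < 0:
                    running = 0  # restart: local, not global, alignment
                best = max(best, running)
    return best

def search(query, database):
    # Rank database entries by their best local score against the query.
    hits = {name: best_local_score(query, seq) for name, seq in database.items()}
    return sorted(hits.items(), key=lambda kv: -kv[1])

database = {
    "geneA": "ATGGCGTACGTTAGC",   # contains the query exactly
    "geneB": "TTTTTTTTTTTTTTT",   # almost no similarity
}
print(search("GCGTACG", database))  # geneA ranks first with score 7
```

A real query would instead be submitted to the BLAST service at NCBI, which compares it against millions of sequences in a few seconds.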
see also Bioinformatics; Genome; Genomics; Homology; Proteomics; Sequencing DNA.
Paul J. Muhlrad
Basic Local Alignment Search Tool. National Center for Biotechnology Information. <http://www.ncbi.nlm.nih.gov/BLAST/>.
Baxevanis, Andreas D. "The Molecular Biology Database Collection: 2002 Update." Nucleic Acids Research. Oxford University Press. <http://www3.oup.co.uk/nar/database/>.
ExPASy Molecular Biology Server. Swiss Institute of Bioinformatics. <http://ca.expasy.org/>.
Virtual Library of Genetics. U.S. Department of Energy. <http://www.ornl.gov/TechResources/Human_Genome/genetics.html>.
Wellcome Trust Sanger Institute. <http://www.sanger.ac.uk/>.
WWW Virtual Library: Model Organisms. George Manning. <http://ceolas.org/VL/mo/>.
Muhlrad, Paul J.. "Internet." Genetics. 2003. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3406500154.html
An online network linking millions of computers throughout the world, the Internet is used by millions of people for research, communication, and commercial transactions. As the technology that spawned the "information age," the Internet has become a tool millions of individuals employ every day for professional, educational, and personal exchanges. As the Internet's popularity has increased, so have the opportunities for making money online. The skyrocketing stock prices of Internet-based companies like Web browser firm Netscape, book retailer Amazon.com, and auction site eBay in the mid-1990s reflected common perceptions about the Internet's potential as a commerce tool. Although investors began shunning these stocks later in the decade, as analysts examined the business models of Internet-based companies more closely, the Internet had already been firmly established as a viable means of conducting commerce.
The precursor of the Internet, ARPAnet, was created in 1969 by the Advanced Research Projects Agency (ARPA) at the directive of the U.S. Department of Defense, which sought a means for governmental communication in the event of nuclear war. To create what would become the world's largest wide area network (WAN), ARPA chose Interface Message Processors (IMPs) to connect host computers via telephone lines. To create the underlying network needed to connect the IMPs, ARPA hired Bolt Beranek and Newman, a Cambridge, Massachusetts-based research and development firm. The last component needed was a protocol, or set of standards, that would facilitate communication between the host sites. This was developed internally by the Network Working Group. ARPAnet's Network Control Protocol allowed users to access computers and printers in remote locations and exchange files between computers. This protocol eventually was replaced by the more sophisticated Transmission Control Protocol/Internet Protocol (TCP/IP), which allowed ARPAnet to be connected with several other networks that had been launched by various institutions. It was this group of networks that eventually formed the core of what later became known as the Internet. No longer useful, ARPAnet was shut down in 1990.
A National Science Foundation decree that prevented commercial use of the Internet was dissolved in 1991, the same year the World Wide Web came into existence. By then, personal computer use by businesses, institutions, and individuals had soared. When the graphics-based Web browsing program known as Mosaic was released in 1993, the Internet's growth exploded. Firms like Netscape and Yahoo! were founded soon after, making access to the Internet even easier. By 1996, an estimated 40 million individuals were accessing the Internet, and by 1999, that number had grown to 200 million.
"Internet." In Ecommerce Webopedia. Darien, CT: Inter-net.com, 2001. Available from e-comm.webopedia.com.
"Internet." In Techencyclopedia. Point Pleasant, PA: Computer Language Co., 2001. Available from www.techweb.com/encyclopediat.
"An Internet Time Line." PC Week. November 18, 1996.
National Museum of American History. "Birth of the Internet: ARPANET: General Overview." Washington, D.C.: Smithsonian Institution. Available from smithsonian.yahoo.com/arpanet2.
PBS Online. "PBS Life on the Internet: Timeline." Alexandria, VA: PBS Online, 2001.
SEE ALSO: ARPAnet; Berners-Lee, Timothy; Communications Protocol; History of the Internet and World Wide Web (WWW); Internet Infrastructure; MIT and the Galactic Network
"Internet." Gale Encyclopedia of E-Commerce. 2002. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3405300254.html
The Internet is inherently decentralized: there is no single controlling organization, it is not operated for profit, and it has been described as ‘anarchy by design’. This is because it grew out of the ARPAnet, an extensive military system created in 1969 by ARPA (the United States Defense Advanced Research Project Agency), which linked a number of US universities, research centres, etc., by means of an electronic ‘nervous system’ which had no headquarters. As a result, the ARPAnet could not be destroyed by an enemy strike at any one locality, and had in addition a capacity for rerouting information if any kind of disruption arose. For the same reason, no government or other organization can impose policy or watertight censorship on what transpires among users of the Internet, who have inherited a system created for very different reasons from those which make the Net useful for them. Most people now access the Net through commercial service providers, such as US-based America On-Line (AOL) and CompuServe and UK-based Demon Internet and Pipex. In 1981, only 213 computers were registered on the Internet; by 1989 there were c.80,000; by late 1990 over 300,000; in early 1992 over 700,000; by 1993, 1–2m worldwide; and by 1996 probably more than 30m people in over 70 countries were exchanging data, news, and comment. See COMPUTING, EMOTICON, NETIQUETTE, WORLD-WIDE WEB.
TOM McARTHUR. "INTERNET, The." Concise Oxford Companion to the English Language. 1998. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1O29-INTERNETThe.html
The Internet is an international system of interconnected computer networks of government, educational, nonprofit organization, and corporate computers. The computers and networks are connected to each other by high-speed data communications lines, and even dissimilar computers are able to exchange data with each other using a set of data communications protocols called TCP/IP (Transmission Control Protocol/Internet Protocol). TCP/IP supports the Simple Mail Transfer Protocol (SMTP) to permit the sending of electronic mail (E-mail) messages, the File Transfer Protocol (FTP) for moving files between computers, and Telnet, which makes it possible to log in and interact with a remote computer. TCP controls the transmission of data between computers, and IP controls the automatic routing of the data over what might be a chain of computers.
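The division of labor described above, IP routing individual packets while TCP guarantees an ordered, reassembled stream, can be sketched with a toy simulation. This is a conceptual illustration only, not a real protocol implementation; the segment size and message are arbitrary choices for the example.

```python
# Toy sketch of the TCP/IP division of labor: an "IP" layer delivers
# numbered segments in arbitrary order, and a "TCP"-like layer restores
# the original ordered byte stream at the receiving end.
import random

def tcp_send(message, size=4):
    # TCP side (sender): split the message into sequence-numbered segments.
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def ip_deliver(segments):
    # IP side: each packet is routed independently, so arrival order
    # is not guaranteed. Simulate that by shuffling.
    delivered = list(segments)
    random.shuffle(delivered)
    return delivered

def tcp_receive(segments):
    # TCP side (receiver): reorder by sequence number and reassemble.
    return "".join(data for _, data in sorted(segments))

message = "Hello, Internet!"
assert tcp_receive(ip_deliver(tcp_send(message))) == message
```

Application protocols such as SMTP, FTP, and Telnet then run on top of this reliable stream, which is why they need not concern themselves with lost or reordered packets.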
The Internet's structure is based on a predecessor network called ARPAnet, which was established by the U.S. Department of Defense's Advanced Research Project Agency (ARPA) in 1969 as an experiment to determine how to build a network that could withstand partial outages, such as from an enemy attack. Each computer on the network communicates with others as a peer instead of having one or a few central hub computers, which would be too vulnerable. In the late 1980s ARPAnet was replaced by NSFNET, run by the National Science Foundation, which expanded the network, replaced its telephone lines with faster ones, and funded more college and university connections to the network. Thus, educational institutions became the dominant users in the 1980s. Other organizations and corporations joined by linking their computers, local area networks (LANs), and wide area networks (WANs) to the Internet and adopting TCP/IP to connect their computers. As a result, the Internet comprises some networks that are publicly funded and some of which are private and which charge network access fees. Consequently, different users pay different fees, or none at all, for the same services. In the 1990s corporations and consumers became the biggest users of the Internet.
See also: Computer Industry
"Internet." Gale Encyclopedia of U.S. Economic History. 1999. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3406400457.html
The Internet is global, with connections to nearly every country in the world; the qualification “nearly” is present in part because the number of countries connected continues to increase, and in part because the Internet is deliberately nonpolitical and tends to deal with nongovernmental levels within a country. The Internet is informal, with a minimal level of governing bodies and with an emphasis in these bodies on technical matters rather than on administration or revenue generation. To date (Spring 1995) the major users of the Internet have been the academic and research communities, but it is inevitable that this situation will change rapidly in the next few years with the growth in commercial interest in the exploitation of the Internet. In addition, the flow of data across borders is a highly complex legal matter, involving the copyright and data protection legislation of the countries involved.
JOHN DAINTITH. "Internet." A Dictionary of Computing. 2004. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1O11-Internet.html
The Internet allows multimedia documents to be moved between any two computers, using an "internetwork" of relaying computers. Multimedia documents can be found by those seeking information using a web browser to "pull" information off the "World Wide Web," or using an e-mail system to "push" information to those currently uninterested or unaware of an issue.
The Internet has been called an "engine of empowerment" that creates healthy "virtual communities." Others, however, say it increases many social and health-related problems, including individual isolation and risky sexual practices, by fragmenting relationships and by increasing the anonymous distribution and viewing of pornographic material. These seemingly contradictory outcomes can be reconciled by understanding that the Internet, like any communications technology, amplifies the intentions of its users. It amplifies these intentions primarily by increasing the "reach" of both the sender and receiver, who often share a common interest. As a result, its use may only increase the sharing of information that reinforces and amplifies preexisting life patterns.
(see also: Advertising of Unhealthy Products; Information System; Information Technology; Patient Education Media; Self-Help Groups; Social Health)
Chiasson, Mike. "Internet." Encyclopedia of Public Health. 2002. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3404000469.html
"Internet." World Encyclopedia. 2005. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1O142-Internet.html
In·ter·net / ˈintərˌnet/ an international computer network providing e-mail and information from computers in educational institutions, government agencies, and industry, accessible to the general public via modem links.
"Internet." The Oxford Pocket Dictionary of Current English. 2009. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1O999-internet.html
ELIZABETH KNOWLES. "Internet, the." The Oxford Dictionary of Phrase and Fable. 2006. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1O214-Internetthe.html
See Information Technology
"Internet." Encyclopedia of Science and Religion. 2003. Encyclopedia.com. (June 25, 2016). http://www.encyclopedia.com/doc/1G2-3404200287.html