A computer virus is a piece of software that "invades" a computer. As such, a computer virus is one of several kinds of infections, including Trojan horses and worms. Infections are themselves a subset of possible attacks on computers and networks; other attacks include probes, unauthorized access, denial of service, Internet sniffers, and large-scale scanning. This entry focuses on viruses, worms, and Trojan horses—collectively termed electronic infections—the three most common kinds of attacks and the ones best known to the public (Carnegie Mellon University Internet site). All such infections raise multiple ethical and political issues: determining responsibilities to protect against them, assigning consequences to those responsible for attacks, and educating users about their vulnerabilities.
A virus is a piece of software that is hidden inside a larger program. When the larger program is executed, the virus is executed as well. During that execution, the virus can try to fulfill its purpose, often to replicate (that is, copy) itself in other programs on its host machine or (via the Internet) to new host machines. This copying and sending takes up resources on the original machine, on the Internet's communications capacity, and on any new machines infected. For a major virus attack, the loss of resources can cost billions of dollars.
One variation on the more traditional application-borne computer virus is the e-mail virus. An e-mail virus attaches itself to a piece of e-mail instead of to a program. Another subspecies of computer virus is the "logic bomb." A logic bomb is a virus because it resides inside the operating system or an application; the variation is that a logic bomb executes its harmful side effect only when certain conditions are met, typically when the system clock reaches a particular date. At the appointed time, the virus can do something relatively harmless, like flashing a provocative text message on the screen; but it could also do something far more serious, such as erasing significant portions of the resident host's hard drive.
A virus requires a program or e-mail to hide in. But a computer worm works independently. A computer worm uses computer networks and security flaws to replicate itself on different networked computers. Each copy of the worm scans the network for an opening on another machine and tries to make a new copy on that machine. As this process is repeated over many generations, the computer worm spreads. As with viruses, both the propagation and any other side effects can be frivolous or draconian.
A Trojan horse is a complete computer program that masquerades as something different. For example, a web site might advertise a freeware computer game called Y. But when someone downloads and runs a copy of Y, Y erases the hard drive of the host machine. Unlike viruses, a Trojan horse does not include a self-replication mechanism inside its program.
Early in the history of computer development, some people thought of electronic infections as relatively harmless, high-tech pranks. But once these infections began to cost the public enormous amounts of time, energy, and money, they ceased to be laughing matters. The technical details that separate viruses, worms, and Trojan horses are useful for understanding the different techniques, but all infections share a common feature: They enter someone's computer without permission. Although different infections have different effects (and some claim to be benign), all of them take unauthorized control of another machine and/or its memory.
In the early 1990s, there was actually some controversy about whether or not computer infections and other "hacking" activities were always unethical. In some instances benign infections simply used underutilized computer power in ways that did not compromise the owner's uses (Spafford 1992). But the reflective consensus in the early twenty-first century is that all infections and break-ins are wrong. Reasons for this consensus include the views that such activity causes real harms, violates legitimate rights to non-intrusion, steals resources that could be put to better use, and encourages otherwise unnecessary spending on security that could be devoted to better things (Johnson 2001).
Even when it is agreed that all computer infections are unethical, important questions remain. For example, most computer infections now known are aimed at Microsoft Corporation operating systems and applications. That may be a consequence of Microsoft's market share, of technical details about Microsoft's software, of hackers' attitudes toward Microsoft, or a combination of these. Each has ethical dimensions. When one condemns the creator of a harmful infection, should some of the blame for the damages not be shared by vendors who release software with security holes that are easily exploited? Are users who fail to install security updates or who adopt easily broken passwords not partially responsible? Such questions are part of an ongoing discussion of responsibility that can be found in analyses of the degrees of victim contributions and extenuating circumstances with regard to a wide range of crimes, from fraud to theft and assault and battery.
Consider also questions raised by teaching students about computer infections. Those offering such classes defend their actions as helping students learn how to defend against such infections; critics have argued that such classes may actually encourage students to write and propagate new infections.
Both the defenders and the critics of academic work on computer infections raise legitimate issues. Considering their positions consequentially, if such classes reduce the number and severity of infections, then they are morally justified; conversely, if they increase the number or severity of infections, then they are not justified. But it seems unlikely that enough information about consequences can be easily gathered to settle the question.
Another approach is to analyze classes that teach about computer infections in terms of course content. Surely it would be noncontroversial to teach historical facts about the occurrence and severity of computer infections. Furthermore, discussing the ethics of computer infections and other attacks is also unlikely to raise objections. The content most likely to prove objectionable would be teaching the technical details of how to construct computer infections, with assignments that require students to design new infections.
Is it ethical to teach the technical details of computer infections? Consider an analogy: Is it ethical to teach accounting students the details of accounting fraud? Such classes exist and have not elicited the same kind of criticism that has been leveled against computer infection classes. It seems reasonable in both cases that professionals in the field should know how people have conducted "attacks" in order to detect and defend against them in the future.
Yet there are ethically significant differences between accounting and computing—the rules of proper accounting are more explicitly spelled out than the rules of "proper computing." Accountants are held to more formal, legal, and professional standards than computing professionals. Furthermore, it takes very little advanced skill to launch a computer attack, but it requires some sophistication (and often a high position in a company) to launch a major accounting fraud. Finally, although an accounting class might include the study of strategies to defraud a company, it seems unlikely that a student could actually implement a fraud during the class, whereas computer science students can indeed launch a computer virus (and some have).
The analogy suggests that the notion of teaching computer science students about computer infections seems reasonable, but that some cautions about what is taught and how it is taught may be necessary. There is no airtight case for or against classes that include details of computer infections, but there are two important perspectives to consider: consequentialist arguments and arguments from analogy. Other perspectives might include deontological obligations to share knowledge or to recognize traditions of forbidden knowledge, and the character or virtue implications for both teachers and students in such classes.
Preliminary explorations nevertheless suggest that the content of the courses and the context in which technical details are presented will determine whether or not such courses are ethical. One can envision a course in which a professor does not emphasize the responsibilities of a programmer and does not discuss the negative impact of computer infections; in such a class, the presentation of technical details of computer infections is likely inappropriate. One can also envision a course in which professional responsibilities and public safety are central themes; in such a course, details of computer infections might be entirely appropriate.
|  | Minor consequences to others | Major consequences to others |
| --- | --- | --- |
| Unintended | Education | Education plus minor punishment |
| Intended | Education plus minor punishment | Education plus major punishment |

SOURCE: Courtesy of Keith W. Miller and Carl Mitcham.
If infecting systems that don't belong to you is wrong, it is necessary to consider appropriate sanctions against those who create and launch computer infections. In general, punishments for any unethical behavior should take into account both consequences of the act and intentions of the actor. Table 1 shows a broad view of how sanctions could be applied using considerations of intent and consequences.
Unintended minor consequences (as when a person experimenting designs a virus to see how it works and accidentally lets it get away, but it does very little damage) surely deserve little in the way of punishment, although some acknowledgement of the damage done seems appropriate. Unintended major consequences and intended minor consequences both deserve education plus some form of punishment, although probably not the same in each case. But intended major consequences could be assigned significant punishments, including jail and restrictions on future computer use.
The community of computer hackers also has a responsibility to exercise social pressure on intentional offenders, up to and including the punishment of ostracism. Indeed, to some extent it seems to do this by reserving the pejorative term crackers for such persons. But professional organizations such as the Association for Computing Machinery might also instigate formal forms of ostracism. Codes of ethics for computing professionals already include explicit prohibitions against computer attacks. For example, section 2.8 of the ACM Code of Ethics states: "Access computing and communication resources only when authorized to do so" (ACM Internet site). However, the ACM rarely disciplines members, and removal from the ACM is not seen as a significant threat to most hackers.
KEITH W. MILLER
Johnson, Deborah G. (2001). Computer Ethics, 3rd edition. Upper Saddle River, NJ: Prentice Hall. A groundbreaking computer ethics textbook.
Spafford, Eugene H. (1992). "Are Computer Break-ins Ethical?" Journal of Systems and Software 17(1): 41–48. A relatively early work on the ethics of hacking by an expert in Internet security.
Association for Computing Machinery (ACM). "ACM: Code of Ethics." Available from http://www.acm.org/constitution/code.html#sect2. An influential code of ethics in the United States.
Brain, Marshall. "How Computer Viruses Work." HowStuffWorks. Available from http://computer.howstuffworks.com/virus1.htm.
Carnegie Mellon University. CERT Coordination Center. "CERT/CC Overview Incident and Vulnerability Trends," Module 4: "Types of Intruder Attacks." Available from http://www.cert.org/present/cert-overview-trends/. CERT is an acknowledged authority on computer security, and is often involved in quick reactions to cyberspace threats.