DNA Testing, Fingerprints, and Polygraphs
Sections within this essay:
Background
Comparing the Techniques
Fingerprints: The First ID
DNA: Greater Accuracy
DNA As An Exoneration Tool
Organizations:
American Polygraph Association
Federal Bureau of Investigation
The Innocence Project
National Institute of Justice (U.S. Department of Justice)
Viewers who watch police investigation shows on television often see intrepid experts solve crimes with the aid of a fingerprint on a doorknob or a strand of DNA from a hair miles from the crime scene. While the ease with which criminals are identified is exaggerated, the general picture is correct: Both DNA and fingerprints can help identify individuals through their unique markers—which means they can be useful tools both in identifying criminals and in clearing those who have been wrongly accused.
Polygraph machines, better known as "lie detectors," are also seen on television, but usually they can be found on older programs. The polygraph does not actually measure whether a person has made a true or false statement; in fact, it measures changes in breathing, blood pressure, and perspiration. A person who is lying, claim polygraph proponents, will breathe more rapidly, have a faster heartbeat, and sweat more profusely than one who is telling the truth.
All three of these tools have an established place in the criminal justice system, as well as other areas of society. When determining a person's culpability in committing a crime, law enforcement experts agree, the more corroborating evidence, the stronger the case. Even if a person confesses and witnesses to the crime come forward, having indisputable physical evidence helps guarantee that the right person will be called to account for the crime.
Fingerprints are the oldest and most accurate method of identifying individuals. No two people (not even identical twins) have the same fingerprints, and even the most accomplished criminals find it difficult to avoid leaving incriminating fingerprints at the scene of a crime.
Each fingerprint has a unique set of ridges and points that can be seen and identified by trained experts. If two fingerprints are compared and one has a point not seen on the other, those fingerprints are considered different. If there are only matching points and no differences, the fingerprints can be deemed identical. (There is no set number of points required, but the more points, the stronger the identification.) Fingerprints can be visible or latent; latent fingerprints can often be seen with special ultraviolet lights, although on some surfaces a simple flashlight will reveal the print. Experts use fingerprint powder or chemicals to set a print; they then "lift" the print using special adhesives.
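The comparison rule described above — any unmatched point rules out identity, while more matching points make a stronger identification — can be sketched in a few lines. This is purely illustrative: real fingerprint examination works on images and allows for smudging and distortion, and the (feature, x, y) point representation here is a made-up assumption.

```python
# Toy model: a print is a set of (feature, x, y) "points". This only
# mirrors the comparison rule described in the text, not real practice.

def compare_prints(points_a, points_b):
    """Any point present in one print but not the other -> different.
    Otherwise report how many points match (more points = stronger ID)."""
    if points_a ^ points_b:  # symmetric difference: unmatched points exist
        return "different"
    return f"possible match ({len(points_a)} matching points)"

crime_scene = {("ridge_ending", 12, 40), ("bifurcation", 33, 7)}
suspect     = {("ridge_ending", 12, 40), ("bifurcation", 33, 7)}
print(compare_prints(crime_scene, suspect))  # possible match (2 matching points)
```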
The use of fingerprints for identification goes back to ancient times. In ancient Babylonia and China, thumbprints and fingerprints were used on clay tablets and seals as signatures. The idea that fingerprints might be unique to individuals dates from the fourteenth century. In 1686 the physiologist Marcello Malpighi examined fingerprints under a microscope and noted a series of ridges and loops. In 1823, another physiologist, Jan Purkinje, noted at least nine different fingerprint patterns.
The pioneer in fingerprint identification was Sir Francis Galton, an anthropologist by training, who was the first to show scientifically how fingerprints could be used to identify individuals. Beginning in the 1880s, Galton (a cousin of Charles Darwin) studied fingerprints to seek out hereditary traits. He determined through his studies not only that no two fingerprints are exactly alike, but also that fingerprints remain constant throughout an individual's lifetime. Galton published a book on his findings in 1892 in which he listed the three most common fingerprint types: loop, whorl, and arch. These classifications are still used today.
It did not take long for law enforcement officials to recognize the potential value of fingerprint evidence. Sir Edward Richard Henry, a British official stationed in India, began to develop a system of fingerprint identification for Indian criminals. (Henry created 1,024 primary fingerprint classifications.) In Argentina, Juan Vucetich, a police official, also used Galton's findings to create a fingerprint system. (He used Galton's research to make a fingerprint identification of a murderer in 1892.) By the beginning of the twentieth century, Scotland Yard had begun to compile fingerprint information, using a classification system based on Henry's work and creating a Central Fingerprint Bureau. In the United States, the New York Police Department, the New York State Prison System, and the Federal Bureau of Prisons instituted a fingerprint system in 1903, and in 1905 the U.S. Army began using fingerprint identification.
The first murder case in the United States in which fingerprint evidence was used successfully was in Illinois in 1910, when Thomas Jennings was accused of murdering Clarence Hiller after his fingerprints were found at Hiller's house. Jennings appealed his conviction, but the Supreme Court of Illinois upheld the evidence in 1911 and Jennings was executed in February 1912. People v. Jennings thus established fingerprint evidence as a reliable standard.
The Federal Bureau of Investigation (FBI) established a fingerprint repository through its Identification Division beginning in 1924. This repository held fingerprint cards in a central location. Over the next 50 years the FBI processed more than 200 million fingerprint cards. To eliminate duplicate fingerprints and make it easier to store and share fingerprints among law enforcement agencies, the FBI developed the Automated Fingerprint Identification System (AFIS) in 1991, which computerized the card system. The Integrated AFIS system (IAFIS) was introduced in 1999; a law enforcement official can request a set of criminal prints from IAFIS and get a response within two hours.
Criminal fingerprints are not the only ones on file; civil fingerprints are also kept. People who apply for government jobs, positions that handle confidential information, banking jobs, teaching jobs, law enforcement jobs, or any job that involves security issues can be fingerprinted. IAFIS stores civil prints as well as criminal prints.
The word polygraph comes from the Greek for "many writings." The polygraph machine measures physiological information from the body: breathing, blood pressure, and perspiration. The faster the breathing, the higher the blood pressure, and the greater the amount of sweat, the more likely it is that the person being tested is nervous.
Although it had been suggested in the nineteenth and early twentieth centuries that physiological changes could help determine whether a person was telling the truth, the first serious effort to apply this information came in 1920 when John Larson, a police officer in Berkeley, California, developed a device (which he called a polygraph) that could measure breathing and blood pressure. Larson believed that his invention could help determine whether a suspect was telling the truth. When the results of a polygraph test were offered as evidence in a criminal case in 1923, they were challenged, and the Court of Appeals of the District of Columbia ruled in Frye v. United States that polygraph evidence needed to meet three criteria to be accepted: (1) the general scientific community must acknowledge the test's reliability, (2) the person conducting the test must be qualified to do so, and (3) it must be shown that correct procedures were followed. Known as the "Frye test," this standard remained the judicial benchmark for 70 years.
During that time, scientists worked at refining Larson's invention. Leonarde Keeler, who had worked with Larson, began developing more sensitive polygraph machines in the 1930s, even starting a polygraph school in 1948.
Through the years, polygraphs were used by law enforcement agencies, but they were not considered definitive. To begin with, the person who is hooked up to the polygraph would already be quite nervous, and to have tubes placed on the chest, a blood pressure cuff on the arm, and metal plates on the fingers would not relax most people. Moreover, there is a difference of opinion on the accuracy of polygraph tests. The American Polygraph Association has stated that inconclusive polygraph results are not the same as incorrect results. Yet typically inconclusive readings are figured in with incorrect ones when establishing a percentage of accuracy.
Polygraph experts continued to fine-tune the machines and also developed a questioning technique intended to produce fewer incorrect readings: the subject is asked to respond "yes" or "no" to a mix of relevant and unrelated questions, which is meant to screen out the effects of ordinary nervousness.
In 1975, federal judges were given more discretion about the admissibility of evidence under the new Federal Rules of Evidence. Thus, a judge could allow a jury to consider polygraph results even if they did not pass the Frye test. In 1993, the U.S. Supreme Court issued an opinion in Daubert v. Merrell Dow Pharmaceuticals that definitively replaced the Frye standard. The court said that judges could admit certain scientific evidence as long as the theory behind it could be tested, it had been subject to peer review and publication, the potential error rate was known, and the scientific community in general accepted the theory. In the 1998 case of U.S. v. Scheffer, the U.S. Supreme Court ruled that polygraph tests did not have to be admitted as evidence in military trials. (President George H.W. Bush had banned the admission of polygraph evidence from military trials in 1991, citing its unreliability.) The Court did not, however, ban polygraph evidence outright; under Daubert, it is generally up to the individual judge to decide whether polygraph evidence can be used.
The polygraph has also been used to pre-screen job applicants and to test employees' truthfulness about such issues as drug use or theft. In 1988 Congress passed the Employee Polygraph Protection Act (EPPA), which prohibited businesses from using polygraphs to pre-screen applicants or to test current employees, and which prohibited companies from disciplining or firing employees solely for failing a polygraph test. (Polygraphs can be used if an employer can show other evidence against an employee, but the employee still has the right to refuse.) EPPA does not apply to government workers.
The use of DNA (deoxyribonucleic acid) as a method of identification is relatively new, but it has proven an effective means of identifying criminals—and perhaps more important, of eliminating people as crime suspects. Unlike a fingerprint, DNA is not absolutely unique to each person (identical twins share the same DNA), but if a criminal leaves no prints behind, law enforcement officials must rely on minute DNA samples from blood, saliva, and other bodily fluids, hair, or skin. DNA testing is also used in paternity disputes to determine the identity of the actual father in custody, inheritance, or child support suits.
DNA testing can be done by standard techniques such as restriction fragment length polymorphism (RFLP), polymerase chain reaction (PCR), short tandem repeat (STR), and mitochondrial analysis. In RFLP testing, a DNA sample is mixed with a chemical substance that helps examiners isolate and identify specific key fragments of the sample that can be used in comparison analysis. A drawback of RFLP is that it requires a fairly large DNA sample. With PCR, a series of chemical reactions generates copies of a minute DNA sample, thus amplifying a small or degraded piece of information. In STR analysis, various DNA regions in a sample are compared with other samples for similarities. The FBI performs STR analysis with special software that can identify thirteen of these regions in a DNA sample. Mitochondrial DNA analysis is often used for extracting samples from bones and teeth, for which the other methods are not effective.
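As a rough illustration of STR comparison, the sketch below checks whether two profiles carry the same pair of repeat counts at each tested region (locus). The three locus names are drawn from the FBI's core set, but the allele values and the dictionary representation are hypothetical, chosen only to show the idea.

```python
# Hypothetical STR profile: locus name -> pair of repeat counts
# (one allele inherited from each parent). The FBI's core set used
# thirteen such loci; three are shown here with made-up values.

def str_match(profile_a, profile_b):
    """Profiles match only if every locus carries the same allele pair."""
    if profile_a.keys() != profile_b.keys():
        return False  # profiles typed at different loci can't be compared
    return all(sorted(profile_a[locus]) == sorted(profile_b[locus])
               for locus in profile_a)

evidence = {"D3S1358": (15, 17), "vWA": (14, 16), "FGA": (21, 24)}
suspect  = {"D3S1358": (15, 17), "vWA": (16, 14), "FGA": (21, 24)}
print(str_match(evidence, suspect))  # True: allele order within a locus is irrelevant
```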
The FBI keeps a computerized databank of DNA samples called CODIS (Combined DNA Index System), which contained about 1.7 million DNA profiles as of 2003. The profiles stored in CODIS can be used to convict criminals, and also to exonerate innocent people. There are numerous examples of criminals whose DNA matched a profile from an earlier crime and who were then charged with the crime; likewise, there are examples of individuals whose innocence was confirmed when DNA found at a crime scene turned out to belong to another person identified through the profiles.
Not only can DNA be used to convict criminals; it has also been used successfully to exonerate individuals, some of whom were wrongly imprisoned for more than two decades.
Often, the person who is wrongly convicted of a serious crime such as murder or rape has a criminal record for petty crimes, which means a record already exists. These individuals are frequently convicted on eyewitness testimony, but without any physical evidence tying them to the crime.
The Innocence Project, created in 1992 by Peter Neufeld and Barry Scheck at the Benjamin Cardozo School of Law in New York, works to exonerate people through postconviction DNA testing, in which DNA from the crime scene is tested against the accused's DNA. Often, physical evidence from a crime is kept for many years. If the evidence includes samples of blood, hair, skin, or other material that can yield DNA, it can often be used to prove that the person accused could not have committed the crime. Moreover, if it turns out that the DNA matches a profile in a database such as CODIS, the real criminal can be located and tried. From 1992 to the beginning of 2006, the Innocence Project helped exonerate 173 prisoners.
Opponents of capital punishment have pushed for DNA testing to be used more regularly, and many of those who favor capital punishment agree that those convicted of a capital offense should be allowed to make use of all evidence. One of the fears that comes with capital punishment is that the wrong person could be executed for a crime. A case involving a man who was executed in 1992 gained national attention in 2005, when Governor Mark Warner of Virginia ordered DNA testing on a 24-year-old DNA sample to determine whether Roger Keith Coleman had murdered his sister-in-law in 1981. Coleman had proclaimed his innocence, and although his DNA had been tested before his execution, lawyers said the examiner might have misinterpreted the results. Using more advanced technology, Coleman's DNA was tested in January 2006, and the results confirmed that he was in fact the killer. Supporters of capital punishment said that claims of the death penalty's fallibility were unfounded, but opponents noted that the danger of a wrongful execution still existed and called for increased use of DNA as an identification tool.
Advances in Fingerprint Technology, Henry C. Lee and R.E. Gaensslen, eds., CRC Press, 2001.
DNA: Forensic and Legal Applications, Lawrence Kobilinsky, Thomas F. Liotti, and Jamel Oeser-Sweat, Wiley-Interscience, 2005.
Fingerprints: The Origins of Crime Detection and the Murder Case That Launched Forensic Science, Colin Beavan, Hyperion, 2001.
"The Unrealized Potential of DNA Testing," Victor Walter Weedn and John W. Hicks, National Institute of Justice, 1998.
Lie Detectors: A Social History, Kerry Segrave, McFarland, 2004.
American Polygraph Association
P. O. Box 8037
Chattanooga, TN 37414 USA
Phone: (423) 892-3993
Fax: (423) 894-5435
Primary Contact: Milton O. Webb, Jr., Executive Director
Federal Bureau of Investigation
J. Edgar Hoover Building, 935 Pennsylvania Avenue NW
Washington, DC 20535 USA
Phone: (202) 324-3000
Primary Contact: Robert Mueller, Director
The Innocence Project
100 Fifth Avenue, Third Floor
New York, NY 10011 USA
Phone: (212) 364-5340
Fax: (212) 364-5341
Primary Contact: Peter J. Neufeld and Barry C. Scheck, Co-Directors
National Institute of Justice (U.S. Department of Justice)
810 Seventh Street, NW
Washington, DC 20531 USA
Phone: (202) 307-2942
Primary Contact: Glenn R. Schmitt, Acting Director