Biomedicine and Health: Blood


Introduction

For thousands of years people put the seat of individuality in the heart; in the modern West we locate it in the brain. By contrast, blood has long been designated the place where group identity resides. For ancient peoples, and many modern ones, blood is the fluid that binds together family, clan, order, race, and even nation. It is said to be “thicker than water,” and noble blood is proverbially blue. Blood is deeply connected to breeding and lineage. Breeders of racehorses talk of bloodlines as a geneticist would speak about DNA.

Blood is prominent in the earliest written records. The Sumerians (c.3000 BC) developed a pictogram for it, and the Egyptian Ebers papyrus (c.1600 BC) described the belief that the heart turned food into blood. Ancient peoples seem to have debated whether the heart or blood—or both—was the seat of life (no one gave much thought to the brain). The Hebrews seem to have voted for blood and prohibited eating or drinking it. A passage in Genesis that forbids eating blood is today interpreted by Jehovah's Witnesses as a prohibition against receiving a blood transfusion.

Historical Background and Scientific Foundations

Body fluids—humors—had far more importance in ancient medical writings than solid body organs. This is hardly surprising: They seem a natural measure of health. Every day we take in fluids and pass urine and sweat. If we are sick, the most obvious signs are disturbances of the fluids: In the summer we may have diarrhea, and in the winter we might cough up runny phlegm. The works associated with the Greek physician Hippocrates (c.460–c.370 BC), although clearly from many different authors and sects, use humors extensively to explain health and disease.

The main humors appearing in the Hippocratic texts—blood, phlegm, black bile, and yellow bile—were codified into a theory of medicine by Galen of Pergamum (AD c.129–c.216), a Greek physician who practiced in Rome. Humoral theory served medicine for 1,400 years, both in Islam and the West. Humors were associated with seasons, age, sex, constitution, and many other things, and blood predominated in young men and in the spring. When Tennyson wrote “In the spring a young man's fancy lightly turns to thoughts of love,” he was merely expressing an ancient sentiment associated with rising sap or a surge of blood. Bleeding or bloodletting (also known as venesection or phlebotomy) is an ancient practice. It was used in the West until relatively recently and is still widespread in some non-Western cultures.

Ancient medical ideas of the humors (particularly blood) began to be challenged during the scientific revolution of the seventeenth century. The fluid so obviously essential to life became a major object of speculation and experiment, especially after English physician William Harvey (1578–1657) published Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus (An anatomical exercise concerning the motion of the heart and blood in animals) in 1628. Interest was also fueled by the growth of chemistry and the new corpuscular (atomistic) philosophy.

In Oxford and London a circle developed around Robert Boyle (1627–1691), son of the Earl of Cork and an enthusiast for mechanical and chemical inquiries. These investigators used the newly invented air pump to study the functions of blood. The circle included Robert Hooke (1635–1703), curator of experiments at the newly established Royal Society, and the physician Richard Lower (1631–1691). Together they conducted experiments on dogs, exposing their lungs and artificially ventilating them with bellows. Lower, a passionate supporter of the idea of circulation, reported that blood changed color from dark red to bright red as it passed through the lungs, not in the heart as was previously believed. A younger collaborator of Lower, the chemist and physiologist John Mayow (1640–1679), postulated on the basis of experiments and chemical theory that “elastic” particles in the air were necessary to support combustion and life and were taken into the blood during respiration.

The most remarkable and daring exploits of this band of virtuosi were those involving blood transfusion. Their experiments confirmed that the scientific revolution was constituted as much by a quest for practical results as it was by disinterested inquiry. Their work was provoked by the studies and writings of another member of the group, Sir Christopher Wren (1632–1723), architect of St. Paul's Cathedral. Wren had injected various fluids into the veins of animals and observed the effects. In 1665 Lower used hollow quills to transfuse blood from one dog to another without apparent adverse effects. In 1667 he transfused blood from a sheep to a mad clergyman, Arthur Coga, testing the theory that blood from a “gentle” animal like a lamb might quiet a troubled and disturbed spirit. The patient reported himself “very well.” However, a similar but highly controversial experiment in France, after which the patient died (whether or not from transfusion was and is contested), resulted in a ban on further trials until the nineteenth century.

After the failed blood transfusion experiment in the seventeenth century, there were no further recorded attempts until 1817, when a London obstetrician, James Blundell (1790–1878), sought a treatment for the often-fatal hemorrhage that could occur following childbirth. He began to experiment with transfusion between dogs, but insisted that only human blood should be used on human patients. He attempted transfusion on ten very sick women. Two were so ill that nothing availed, but five of the remaining eight recovered (of course we cannot be sure this was owing to transfusion). After this, numerous doctors, notably surgeons, tried transfusion with mixed success.

Blood's Physical Properties

Of great importance for future studies of blood was the rise of gas chemistry. From the middle of the eighteenth century onward, common air was analyzed and separated into a number of constituent gases, and other gases were generated experimentally. At the forefront of this work were the English dissenting minister Joseph Priestley (1733–1804) and the wealthy French chemist Antoine Lavoisier (1743–1794). By the end of the century Lavoisier's innovative chemical nomenclature had swept all before it, and his newly named “oxygen” became an integral part of the investigation of blood. Chemists quickly saw in the study of gases a fast lane to understanding the blood's respiratory functions.

The nineteenth century saw the beginning of a two-pronged scientific study of blood that has continued ever since. Both inquiries were underpinned by new laboratory sciences. Side by side, physicochemical analysis and biological investigation broke blood into myriad constituents and numbered its countless functions. Nonetheless, scientists failed to demystify it.

Blood is unlike other tissues because of its fluid, mediating role between the environment and the interior of the organism. Ultimately all studies of blood seek to comprehend how this mediation is effected. In the nineteenth century, the physicochemical approach built on the foundation of eighteenth-century gas chemistry and centered on blood's role in respiration. Much of this work was done in Germany, where physiologists readily embraced the knowledge and technologies of the new professional disciplines of physics and chemistry. In 1838 the iron-containing compound hematin was isolated from blood. Over the next two decades hemoglobin was crystallized, and in 1864 Ernst Felix Hoppe-Seyler (1825–1895), a physiological chemist in Germany, demonstrated that hemoglobin could be split into hematin and a protein.

By the beginning of the twentieth century the hybrid discipline of biochemistry had been created, and it soon became the premier science of the blood's physicochemical properties. In many ways biochemists still approach blood in the same way as their nineteenth-century predecessors. Rather than being limited to respiration, however, modern studies investigate blood's role in keeping the internal environment of animals constant in the face of an ever-changing external world. This capacity to regulate the internal environment was named homeostasis by the American physiologist Walter Bradford Cannon (1871–1945), building on the studies of the French physiologist Claude Bernard (1813–1878).

In one way or another, the huge array of studies encompassed by biochemistry bear on the blood's mediating role. These studies include investigation of the blood's electrical, optical, and thermal properties, iron metabolism, respiratory heme pigments, biosynthetic pathways, enzymes, metabolite transport, maintenance of acid-base balance, and so on. In the biochemical laboratory of a modern hospital, a vast range of substances can now be measured to indicate not only the condition of the blood itself but the state of every organ in the body. The “blood test” has the sort of status in medicine today that bloodletting did 200 years ago. Forensic investigation of blood's physical properties has also played an important part in decoding the “blood spatter” produced by criminal violence.

Cell theory was first proposed by microscopists in Germany in the late 1830s and 1840s; with it, red blood cells (erythrocytes) and many varieties of white cells were described. (Microscopic red discs had been seen in the blood since the seventeenth century, but without a unifying theory their nature was open to all sorts of speculation.) Leukemia (an excess of white cells in the blood) was one of the first diseases to be described in cellular terms. An important account of the disorder was given by Rudolf Virchow (1821–1902), the German pathologist who also named the condition.

It was, however, modern germ theory, created in the 1880s (again largely in Germany), that brought blood to the fore as a biological substance. Hardly had that theory been accepted when the problems of resistance and immunity to bacterial attack raised their heads. Working in Germany, bacteriologists Emil von Behring (1854–1917) and Shibasaburo Kitasato (1853–1931) injected various laboratory animals (mice, rabbits, guinea pigs) with nonlethal doses of the toxin produced by the tetanus bacillus. They found that blood serum from these animals could protect other animals of the same species from otherwise lethal doses of the toxin. Under the right circumstances, animals could produce substances in their blood—antitoxins—that neutralized the most vicious bacterial poisons. In 1891–1892, using this principle, von Behring successfully treated children suffering from diphtheria with antiserum produced in horses.

Not long after this, bacteria and their products were cultivated, attenuated, and made into prophylactic vaccines—agents that are injected to protect people from diseases such as typhoid and tetanus. In this case, as with curative sera, crucial transformations seemed to have taken place in the blood. Scientists noticed too that injection with horse sera sometimes produced very nasty (anaphylactic) reactions. Later, disorders such as hay fever and asthma were seen to produce similar responses. What was to become the modern massive discipline of immunology was being created in the early twentieth century, and its territory was the blood.

At the beginning of the twentieth century two theories were propounded to explain both long-term immunity and the body's immediate response to bacterial invasion and toxins. A humoral theory placed resistance in the serum; its most famous supporter was the German medical scientist Paul Ehrlich (1854–1915). The cellular theory, put forward by the Russian zoologist Élie Metchnikoff (1845–1916), proposed that a special sort of white cell in the blood could ingest alien material. These cells, he said, were phagocytes (from the Greek phagein, “to eat”). The controversy was resolved by the recognition that both sorts of immunity exist and, indeed, complement one another.

There was yet another piece of laboratory work in the early twentieth century that contributed to the explosion of confusing observations about blood. It was a study that turned out to have major practical consequences. In 1900, in Vienna, Austrian immunologist Karl Landsteiner (1868–1943) mixed blood cells and sera from different individuals in his laboratory. In some cases the cells clumped together (agglutination), in others they did not. Landsteiner and others then divided human blood into four groups, A, B, AB, and O, on the basis of antigens contained in the red cells.

IN CONTEXT: BLOOD TYPES

According to the American Association of Blood Banks (AABB), about eight million volunteer donors give the roughly 13 million pints of blood used in the United States each year. Donated blood helps a variety of people: it can restore a person's blood volume after surgery, accident, or childbirth; support patients being treated for cancer, leukemia, and other diseases; and improve the blood's ability to carry oxygen.

A sample of the donated blood is taken for testing. It is checked for infectious diseases such as AIDS (acquired immunodeficiency syndrome) and syphilis, for anemia, and, if the blood type is not already known, for blood typing. Human blood falls into four major groups—A, B, AB, and O—which take their names from certain molecules found on the surface of the red blood cells. If a person receives a donation of an incompatible blood type, the blood cells can clump together, a dangerous and possibly fatal situation. Type O blood can be given to persons with A, B, AB, or O blood (which is why type O is sometimes called the “universal donor”), but a person with type O blood can receive only type O blood. It is also important to match the Rh factor of the blood, which can be positive or negative.

In 1940 Landsteiner, working with Alexander Wiener, discovered a subdivision of these groups, the rhesus or Rh factor. This is a specific agglutinating substance, a protein found in rhesus monkey blood and in that of most humans—at least 85% of the world's population. Those with the protein were labeled Rh positive; those lacking it were called Rh negative. Giving Rh positive blood to an Rh negative individual—whether a laboratory animal or a human being—could produce severe agglutinating reactions.
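
The ABO and Rh rules just described amount to a simple compatibility check: a recipient should not be given red cells carrying an antigen (A, B, or the Rh protein) that his or her own blood lacks. The short Python sketch below encodes only those rules as stated here; the function and table names (red_cells_compatible, COMPATIBLE_DONOR_GROUPS) are hypothetical, and real cross-matching in a blood bank involves many further antigens and antibody tests.

# A minimal sketch, not clinical software: red-cell compatibility using only
# the ABO and Rh rules described in this article.

# For each recipient ABO group, the donor ABO groups whose red cells carry
# no A or B antigen that the recipient's blood lacks.
COMPATIBLE_DONOR_GROUPS = {
    "O": {"O"},                   # type O recipients can receive only type O
    "A": {"A", "O"},
    "B": {"B", "O"},
    "AB": {"A", "B", "AB", "O"},  # type AB recipients can receive any ABO group
}

def red_cells_compatible(donor_abo, donor_rh_positive, recipient_abo, recipient_rh_positive):
    """Return True if donor red cells are ABO- and Rh-compatible with the recipient."""
    abo_ok = donor_abo in COMPATIBLE_DONOR_GROUPS[recipient_abo]
    # An Rh-negative recipient should not be given Rh-positive blood;
    # an Rh-positive recipient can receive either.
    rh_ok = recipient_rh_positive or not donor_rh_positive
    return abo_ok and rh_ok

# O-negative donor to an AB-positive recipient: compatible (the "universal donor").
print(red_cells_compatible("O", False, "AB", True))   # True
# A-positive donor to an O-negative recipient: incompatible on both counts.
print(red_cells_compatible("A", True, "O", False))    # False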

The knowledge of blood groups would prove vital to making transfusions work. In addition, discoveries about blood's other physical properties led to knowledge critically important for using blood transfusion on a fairly wide scale. One major problem with transfusion was resolved at the beginning of World War I (1914–1918), when it was discovered that sodium citrate, by preventing clotting, maintained the blood's fluidity outside the body. These discoveries facilitated the use of transfusions in a variety of settings.

In 1937 Cook County Hospital in Chicago established a blood bank where citrated, refrigerated blood was stored for up to ten days. The onset of World War II (1939–1945) saw the creation of many such banks and the mobilization of donors on a massive scale. Most donors then were volunteers; this remains so today. In 1940 a program for enrolling volunteers was begun in New York; by the end of the war 13 million units of plasma, which has a much longer shelf life than blood, had been shipped. Once the hostilities ceased, surgeons back in civilian life regarded blood banks as a medical essential and no longer a luxury.

IN CONTEXT: BLOOD BANKS

In 1941, when the United States first entered the war, it was felt that blood plasma, rather than whole blood, would be the most appropriate treatment for soldiers suffering from shock caused by blood loss. However, it became apparent that British physicians were saving more lives through the use of whole blood transfusions. By 1944, whole blood was being transported by air from the United States to help soldiers in the European and Pacific theaters of the war.

Thus, in response to the demands of the world war, the U.S. military had built the first national blood donation program. In all, more than 13 million pints of whole blood were drawn by the American Red Cross during World War II for direct use or for the preparation of plasma. Only some of this was used by the military, and the rest was diverted to civilian purposes. After the war the national blood program declined, and it was rebuilt as a localized civilian program.

The United States entered the Korean War in 1950 without a national blood program in place, and no blood was shipped to Korea during the first seventy days of the conflict. A program was assembled in due course, drawing on the experience of World War II, and 400,000 units were used in the next three years. A certain amount of blood was wasted, which led to the development of plastic blood bags for better storage.

Beginning in 1965, the Vietnam War required a military blood program for almost ten years, and many useful lessons were learned from the experience. For instance, studies sponsored by the military enabled whole blood to be stored for four or five weeks, depending on the citrate formula used. Packed red blood cells and fresh-frozen plasma were used for the first time. Frozen red blood cells were also used briefly in Vietnam and were reintroduced in the Gulf War of 1990–1991. Blood has also been used in more recent conflicts such as those in Bosnia, Kosovo, and Iraq.

Experience shows that blood is used less efficiently in a war situation than it is for civilian purposes. For instance, in the Bosnian War, 5,600 units of red blood cells were supplied, but only 79 were used. The remainder was not all wasted—some was given to Bosnian hospitals in desperate need, and the U.S. civilian supply did not suffer. In the Gulf War, there were fewer casualties than had been anticipated, so much of the blood requisitioned was never used. War is, inevitably, unpredictable. Over the last century, medical experience in military conflicts has led to the establishment of a national blood banking system that not only covers the uncertainties of war but also provides for the needs of the civilian population.

Modern Cultural Connections

Since World War II, collection and storage have changed considerably. Preservation has been improved, thanks especially to the plastic blood bags introduced in 1950. New anticoagulants have been developed, and different components, such as platelets, can now be separated and stored. Freezing red blood cells can extend their shelf life to ten years. In 1947 the American Association of Blood Banks (now the AABB) was formed, and today AABB facilities collect and store nearly all of the nation's donated blood.

The post-World War II era saw the emergence of unimagined problems associated with blood transfusion, notably the transmission of infectious diseases, including syphilis, hepatitis B and C, and HIV (human immunodeficiency virus). Many countries now screen donated blood intensively to detect such pathogens, but in areas without these standards—rural China, for example—large-scale outbreaks of HIV have occurred.

Although the study of blood—hematology—has been a medical specialty since the 1920s, it has become a subject of universal significance in the life sciences. Unlike an organ that performs only one or two functions, blood is the river through which all the body's activities flow and is therefore of interest to scientists in every discipline. In this sense the huge amount of twenty-first-century research devoted to blood is simply a massive scaling up of the sort of work carried out 100 years ago, when blood was assigned the place it still holds in biological systems.

Immunology is perhaps the most prominent modern science centered on blood, notably because of immunological problems associated with organ transplants. Molecular biologists have found a gold mine for research in the blood's complex proteins. The reticulocyte (immature red blood cell) has been a model for protein synthesis. The structure of hemoglobin has proved a source of endless fascination and research. But other scientists have been drawn to study blood as well. Geneticists work on the inheritance of blood groups and the genetics of conditions such as high blood pressure and hemophilia; the latter also attracts molecular biologists and hematologists who are experts in clotting mechanisms. In the field, students of evolution and human populations use blood groups as markers.

Blood continues to have significant racial dimensions. Despite laboratory analysis and reduction to objective elements, ancient ideas of its associations with life and race persist in both popular and scientific culture. Belief in vampires—reanimated corpses that live on human blood—can be traced deep into history. Bram Stoker's (1847–1912) 1897 novel Dracula struck a resonant chord with the Victorian public; versions of it have been in wide circulation ever since.

In popular and serious scientific thought after World War I (1914–1918), blood was deeply associated with race, a powerful, emotive term that pervaded eugenic rhetoric. In 1924 Robert Allen, a Democratic congressman from West Virginia, declared: “The primary reason for the restriction of the alien stream … is the necessity for purifying and keeping pure the blood of America.” The Nazis consistently identified racial purity with the “blood” (literal and metaphorical) of Germanic people of so-called pure Aryan (Nordic) descent. Any other “blood” was regarded as polluting.

IN CONTEXT: THE IMPACT OF AIDS ON BLOOD BANKING

When acquired immunodeficiency syndrome (AIDS) was first recognized in the early 1980s, understanding it took precious time. Much was unknown about the disease, and even when the cause was linked to a virus, no test existed to detect the virus in blood. In initial reports to the Centers for Disease Control (CDC), all of the young men with both Pneumocystis pneumonia and Kaposi's sarcoma were actively homosexual, and, early on, the CDC task force considered the disease likely to be confined to the community of homosexual males. By the end of 1981 it became clear that the newly recognized disease affected other population groups, as the first cases of Pneumocystis pneumonia were reported in injection drug users. It also became clear that the disease was not confined to the United States when similar cases were found within a year in the United Kingdom, Haiti, and Uganda, where the disease was already known as “slim.”

Members of the medical community ultimately determined that many of these patients had contracted the disease from donated blood, and that the process used to break blood into its components did not kill what became known as the human immunodeficiency virus (HIV), the cause of AIDS. Because of its minute size, the virus passed through the filters used in the extraction process.

In 1985, the U.S. Food and Drug Administration ordered testing of the national blood supply and required that anyone testing positive for the virus be barred from donating blood. Now that the cause of AIDS could be detected, public bewilderment over AIDS transmission gave way to concern over the dissemination and use of information about infection. The gay community voiced fears of stigmatization of persons found to carry the virus, believing the information would be misused by employers and insurance companies to exclude infected individuals. Incidents of cruelty and prejudice directed toward AIDS victims and perceived risk groups continued to mount, though Haitians were removed from the list of high-risk groups in view of new understanding of heterosexual and injection drug transmission risks. The year 1985 ended with more than 20,000 reported U.S. AIDS cases and over 15,000 cases reported in other nations.

Blood donors are now carefully screened to exclude any who might be at high risk of carrying HIV. To reduce the risk of contracting AIDS and other blood-borne diseases from donor blood, patients who are scheduled to undergo surgery are often urged to donate blood ahead of time so that their own blood can be transfused into them if needed.

Today the study of sickle-cell anemia brings together blood, genetics, and race. The disease was first described in 1910, when a Chicago physician, James B. Herrick (1861–1954), reported sickle-shaped red corpuscles in the blood of a black West Indian student. It was soon recognized as a hereditary disorder and thought to be carried by a dominant gene, raising fear among some whites that interracial marriage would further disseminate the condition. It was cited as a disorder of “Negro blood,” a phrase then in everyday use in biological, social, and public health literature. Conceptions of the disease as carried by a dominant gene chimed perfectly with the rediscovery of Mendelian genetics and with eugenic sentiments. After World War II, however, sickle-cell anemia was shown to be a recessive disorder rooted in a molecular abnormality of hemoglobin, and thus one that could not be passed on as disease by a single carrier parent.

Although this reevaluation was based on the use of a new medical technology—electrophoresis—it has been suggested that this reconstruction of its identity was as much sociopolitical as it was clinical and technological. Molecular biologists are a powerful community. They can inform social policy and influence spending on health. This new view of sickle-cell disease signaled the decline of the doctor as “race detective” and the rise of the “molecular engineer.”

See Also Biomedicine and Health: Dissection and Vivisection; Biomedicine and Health: Galen and Humoral Theory; Biomedicine and Health: Hormonal Regulation of the Body; Biomedicine and Health: Human Gross Anatomy; Biomedicine and Health: Immunity and the Immune System.


Christopher Lawrence
