The perception of blood's movement as regenerative is tied to complex cultural conceptions of blood itself. Blood has long been used as a metaphor for identity: family (‘blood is thicker than water’); class (‘blue blood’); race (‘black blood’); nationality (‘red-blooded American’); opposition (‘bad blood’). It has also been seen as the intangible carrier of the ‘vital spirit’ that animates animal bodies. Some Hippocratic physicians dubbed it the highest of the four humours, the formative substance of body and character. Similarly, blood's movement between bodies has carried and extended these meanings: a sharing of the self and an extension of life. Christians believe communion wine to represent (and, for Roman Catholics, to be) Christ's blood. Drinking it prepares the faithful to enter into eternal life and at the same time strengthens the communal Christian identity.
Early history
The first recorded efforts to transfuse blood directly into living veins came in England in 1665, where Richard Lower transfused blood between dogs. In 1667, Jean-Baptiste Denis, of France's Académie des Sciences, successfully transfused lamb's blood into a human. It has been suggested these transfusions were conducted as much to see if science could correct the unhappy consequences of humanity's fall from Grace as to examine the empirical possibilities of blood's circulation. These experiments were cut short after one of Denis' patients died shortly after receiving a transfusion. A subsequent trial exonerated Denis, but banned transfusion without prior approval of the Académie de Médecine. The French Parliament, the Royal Society, and the Catholic Church subsequently issued general prohibitions against transfusion. For 150 years, it fell from orthodox medical practice.
Transfusion was reintroduced by the London physiologist and obstetrician, James Blundell, in 1818. Blundell conducted several transfusions between 1818 and 1834, many of which he considered to have been successful. It is probable that Blundell was inspired to attempt transfusion by his somewhat vitalistic ideas about blood and by a broader cultural interest in reanimation of the ‘apparently dead’ — evident in scientific movements such as galvanism and resuscitation, and in novels such as Mary Shelley's Frankenstein (1818). Blundell came to espouse two main guidelines for transfusion: it was only to be used on women near death from uterine haemorrhage, and only humans could serve as donors. Thus failures were often attributed to the patient being beyond the reach of medical intervention; successes were presented as dramatic resurrections. Further, human donors, unlike their seventeenth century animal predecessors, resisted having their arteries opened for attachment to the recipient's veins. Blundell devised apparatus for ‘indirect’ transfusion to move venous blood through cups and syringes (‘Impellors’ and ‘Gravitators’), and thence into the patient.
Indirect transfusion inevitably led to another problem: clots. Clotted blood gummed up the pipes of instruments and the veins of humans — to the detriment of both. In 1821, J.-L. Prévost and J. B. A. Dumas, then working in Geneva, proposed defibrination as a solution. Defibrination entailed whipping the blood with a fork or twig so as to collect the fibrin on the whipping object and prevent the blood from coagulating as rapidly. Some, believing fibrin to be a mere waste material, seized upon the procedure, but others opposed it, convinced that fibrin was central to the formation of living tissue. Direct, or ‘immediate’, transfusion was proposed as a way to circumvent these problems. Suggested independently in London by J. H. Aveling and in Geneva by J. Roussel in 1864, immediate transfusion relied on india-rubber tubes and silver cannulae to carry blood, as it had in the seventeenth century, directly from donor to recipient.
In nineteenth-century Britain, transfusion was primarily the domain of obstetricians, though, from the 1870s, surgeons began to use it as well. By the 1880s, however, physiologists began to question the necessity of using blood to replace blood loss. Guided by blood pressure measurements and histological investigations, they increasingly saw the circulation in material terms: as an enclosed, fluid system that contained cellular parts. From this less vitalistic perspective, lost blood might be replaced by fluids that would refill the circulatory system while avoiding nasty coagulation. By the early twentieth century, blood transfusion had generally been replaced by saline infusion. In Britain, surgery textbooks referred to blood transfusion (once again) as a quaint relic of medical history.
Recognition of blood types
It was at this historical moment, with transfusion distinctly out of medical favour, that the Viennese pathologist, and later Nobel prizewinner, Karl Landsteiner, was conducting his famous studies demonstrating that certain antibodies in human serum were not pathological, but normal. He showed that human blood naturally occurred in three different ‘types’. A fourth was discovered in 1902 by his colleagues Decastello and Sturli. These four blood types were later given the names by which they are known today: A, B, O, and the fourth type, AB. The discovery demonstrated that the exchange of human blood carried with it the potential danger of haemolysis: if a recipient's blood plasma contained antibodies to the donor's red blood cells, the cells would clump or disintegrate (haemolyse), leading to discomfort at best and death at worst. It therefore offered a plausible explanation of transfusion's past failures. Landsteiner's studies did not, however, promptly usher in the modern period of transfusion. Relegated to serological realms, and with medical practitioners generally using saline, blood-typing was virtually ignored by clinicians. Indeed, even after transfusion was again ‘rediscovered’ a few years later, it was rediscovered in ignorance of Landsteiner's work. ‘Typing’ blood for transfusion was not generally regarded as essential until late in World War I; and, even then — given the pressing nature of the circumstances — it was not necessarily conducted. Further, Landsteiner himself abandoned his typing work for decades, only returning to it in the 1920s. In 1939–40, he helped lead the investigations that proved the existence of rhesus types, shedding light upon why even the most careful interwar typing sometimes failed to prevent a haemolytic reaction.
Twentieth century developments
From the turn of the century, the Americans had taken the lead in transfusion. On the basis of experiments on shock, American surgeon and physiologist George Washington Crile became convinced that saline could not, in fact, replace lost blood effectively. In 1905, he began to conduct transfusion experiments on humans. Using a technique pioneered by the French surgeon and future Nobel laureate Alexis Carrel, Crile connected a donor's artery to a recipient's vein, allowing direct transfusion of blood between humans. Americans began practising transfusion with some regularity before the War, even recruiting a growing stream of ‘donors’ who were paid for their blood.
World War I proved a turning point. Transfusion was imported, first by Canadians, then by US medical officers. Further encouraged by the simplicity of transfusion undertaken with sodium citrate as an anticoagulant — now, blood could be collected in a bottle, moved from room to room, and even held for a while — British and French surgeons at the front increasingly turned to blood to treat soldiers suffering from the collapse they then called ‘wound shock’. (The Germans, too, performed transfusions.) Moreover, the British and Americans undertook a special ‘anti-shock campaign’ from the summer of 1917. ‘Resuscitation Wards’ were staffed by specially-trained shock teams; in them, collapsed soldiers were given the blood of their lightly-injured brothers-in-arms in an effort to stabilize them for further surgical intervention. It was here that many British sceptics were won over to the benefits of blood transfusion.
Blood donors and blood banks
In the early 1920s, a number of hospitals assembled their own small donor panels: even using the citrate method, donors still had to go to hospital to give blood for each emergency. Initially, they were paid for their blood. Discontented with this ad hoc system, some prominent British doctors began calling for a centralized donor service. In 1921, Percy Lane Oliver, master-organizer and Honorary Secretary of the local Camberwell division of the British Red Cross, started just such a system. Moreover — and, more remarkably — his system relied wholly on unpaid donors, at a time when some American hospitals were paying donors up to $100 for a pint. The fledgling voluntary service grew rapidly, being taken up in 1926 by the greater British Red Cross and providing donors to all London's voluntary hospitals by the early 1930s. Under Oliver's firm hand, the new London Blood Transfusion Service also shaped the rights and responsibilities of the modern voluntary blood donor. The London model was quickly adopted throughout Britain and thereafter by national Red Cross and other organizations in a host of other countries. Debate as to the relative merits of paid and voluntary donor systems, and their implied conceptions of blood as commodity or gift, continues to inform the direction of today's donor programmes throughout the world.
Yet, it is difficult to imagine how the current pervasiveness of donor programmes, and, indeed, of transfusion itself, could exist without the addition of another innovation: blood banking, or, the cold-storage-based exchange of blood. Though initially developed at the Rockefeller Institute in 1916 and applied on a small scale on the French front in 1918, cold storage was virtually ignored until the 1930s, when it was used in Moscow to preserve cadaver blood for later transfusion. The unconventional donors attracted as much attention as did the possibilities of the procedure. Cold storage was given further dramatic introduction to the broader world during the Spanish Civil War, where the international array of doctors attending to its casualties pooled, cooled, then distributed donated blood to the wounded. Back in the US, Chicago's Cook County Hospital applied the process of cold storage to its own blood exchange system in 1937, creating what is thought to have been the first civilian ‘blood bank’. In this odd interplay of war and peace, the place of blood banking (as well as of transfusion more generally) was firmly established in World War II.
Blood products
During the war, the newly-developed procedure of separating the plasma from the blood cells and drying the plasma helped fuel longstanding debates about the best fluids to transfuse in various medical conditions. Dried plasma was indefinitely storable and far more portable than blood. Rehydrated, it appeared to be more effective than whole blood in, for example, the treatment of burns. Gradually, a kind of division of labour for blood components was articulated. Today, whole blood is used in relatively few circumstances — for example, to treat severe haemorrhage, and in heart bypass procedures. Red blood cells in suspension are the alternative in operative procedures, and are given to patients with severe anaemia; platelets or white blood cells can be separately extracted and given to those suffering from a lack of them. Whole plasma is transfused to treat fluid and protein loss, and plasma is also used to obtain particular components which are lacking from the blood in certain disorders. The procedure known as plasma ‘fractionation’, developed in the mid 1930s, has given rise to a host of ‘biologicals’ for infusion. These include albumin, for treating patients with burns; Factors VIII and IX, which help coagulate the blood of men with haemophilia; and immunoglobulins, to provide specific antibodies to infections such as tetanus or chickenpox, or ‘Anti-D’, which is given to rhesus-negative mothers to prevent damage to their rhesus-positive babies.
Blood's processing and transportation were facilitated, and the safety of its infusion improved, by the development of plastic ‘blood packs’ from the early 1950s. Now a familiar icon of transfusion, these packs were widely adopted elsewhere, but they did not become the official containers of Britain's blood until 1975, when they replaced glass bottles.
Despite this medicalization, blood has in many ways retained its privileged cultural status. Citing Old Testament prohibitions against consuming blood, Jehovah's Witnesses forbid the transfusion of blood into their members. More generally, blood's movement between bodies continues to rest upon donor systems that must grapple with the social meanings of a fluid at once intensely personal, medically essential, and commercially valuable. Indeed, blood's extensive processing has sometimes complicated the task of voluntary donor groups, whose staff must persuade donors that the pharmaceutically-produced powders and potions derived from their blood remain direct ‘gifts’ to those in dire medical need — not sold at a profit to ‘outside’ systems.
Current problems
Typing has become more complex, with the recognition of groups within groups, and along with this has come more sophisticated laboratory ‘matching’. A hazard that remains, and that requires meticulous screening of donors, is transmission by blood and its products of viral infections, notably hepatitis and HIV. This was tragically brought to public attention in the fate of haemophiliacs in the 1980s. Treated with biologicals derived from large pools of donated blood, many became infected with HIV and later died of complications from AIDS. The medical, ethical, and legal implications of AIDS for blood transfusion are still being determined. While efforts to clone or create synthetic blood continue, transfusion remains bloody — and, as such, intimately linked to its long cultural history.
Gunson, H. H. and Dodsworth, H. (1996). Fifty years of blood transfusion. Transfusion Medicine, 6, supplement 1.
Keynes, G. (1922). Blood transfusion. Henry Frowde, London.
Titmuss, R. M. (1970). The gift relationship: from human blood to social policy. George Allen and Unwin, London.
See also anaemia; blood; blood groups; haemorrhage; surgery.
Blood transfusion is the process of transferring blood from one person's body to another. A severely injured person or one undergoing surgery may need extra blood to replace that which has been lost. If the extra blood is not available, the person can go into shock and die.
Folk medicine and ancient practice long considered blood to have beneficial, healing properties. Perhaps the earliest recorded case of blood transfusion was that of Pope Innocent VIII (1432-1492). The Pope was transfused in April 1492 with the blood of three young boys. The outcome indicates why transfusion attempts were rare and dangerous: all three boys died.
After William Harvey (1578-1657) explained the mechanism of blood circulation in 1628, interest in transfusion grew. An Italian physician, Giovanni Colle, gave the first concise description of a blood transfusion in 1628. An English clergyman, Francis Potter, seems to have experimented with transfusions in the 1650s. In the 1660s, the Royal Society of London sponsored a series of transfusion trials, after Sir Christopher Wren (1632-1723), the famous architect, used a quill-and-bladder syringe to inject fluid into the vein of a dog to demonstrate a new method of administering medications. Richard Lower (1631-1691) continued the experiments at Oxford University in England and performed the first direct blood transfusion from one dog to another in 1665 by connecting an artery to a vein via a silver tube.
French physician, Jean Baptiste Denis (1643-1704), used Lower's technique in June of 1667 to perform a transfusion from a lamb to an ill human. Several months later, both Denis and Lower transfused blood from a sheep to a man. The promising new technique was abruptly halted in 1668 when one of Denis's transfused patients died. Even though the cause of death was poisoning by the patient's wife, transfusions were banned in France and did not become medically established in England.
In 1818 James Blundell, a physician at Guy's Hospital in London, revived the practice of transfusion by using a syringe to inject blood from human donors. At first Blundell transfused only hopeless cases, but in 1829 he used blood transfusion successfully to treat a woman with postpartum hemorrhage. Both Blundell and James H. Aveling improved the apparatus for carrying out transfusions. The technique was widely used during the Franco-Prussian War (1870-1871).
Blood transfusion remained a risky procedure. The donor's blood tended to coagulate, and recipients were likely to suffer a fatal transfusion reaction. The discovery of blood groups in 1900 solved the problem of fatal reactions. In 1914 the use of sodium citrate as an anticoagulant answered the problem of blood clotting. Austrian-American pathologist Karl Landsteiner (1868-1943) showed the existence of three distinct blood types (the number rose to four in 1902) in what became known as the "ABO" system. Antigens in some types reacted adversely with antibodies in other types, causing the clumping of red cells. The clumping could fatally block blood vessels. Landsteiner's findings made it possible to identify donor and recipient blood types and thus avoid the deadly transfusion reaction in most cases. Typing of blood for transfusion began in 1907. Transfusion reaction was more fully overcome in 1940 when Landsteiner and Alexander S. Wiener discovered the Rhesus factor (Rh factor), which can cause an antigen/antibody reaction even between ABO-matched bloods.
The Modern Procedure
At first, blood transfusion was done via direct connection between donor and recipient. George Washington Crile (1864-1943), an American surgeon, developed a standard surgical method of blood transfusion. After surgically exposing a recipient's vein and a donor's artery, a physician clamped shut the vessels and attached a small tube as a conduit between them. When the surgical clamps were opened, blood flowed from donor to recipient. Edward Lindeman took the procedure out of the operating room in 1913 with a simple needle puncture technique. This method also allowed exact measurement of the amounts of blood being transfused. With all these advances in place, blood transfusion spread rapidly and became firmly established during World War I (1914-1918).
Once blood transfusion was in wide use, storage of donated blood became a problem. The first "blood bank" was set up by Dr. Bernard Fantus in 1937 at Cook County Hospital in Chicago, Illinois. A method of preserving red blood cells for up to 21 days with acid citrate dextrose was developed in the 1940s. African-American surgeon Charles Richard Drew studied in depth ways to preserve and store blood ready for instant use. He discovered that plasma could be processed and preserved for long periods, and transfused in place of whole blood without regard to blood type or matching. Drew established blood banks in England and the United States during World War II (1939-1945). These banks saved thousands of lives by making blood transfusion available to the wounded.
Today, blood transfusion remains a widely used and critical medical procedure. After World War II, methods were developed for separating the various constituents of blood. As a result, in addition to whole blood, a patient may receive "packed" red cells, granulocytes (white cells), platelets, plasma, or plasma components. Both natural and artificial blood substitutes are also used. Perhaps most serious of the remaining risks of blood transfusion is the possibility of transmitting disease via the donor's blood. Of special concern is the transmission of HIV and the hepatitis viruses. For this reason, donated blood is carefully screened.
blood transfusion, transfer of blood from one person to another, or from one animal to another of the same species. Transfusions are performed to replace a substantial loss of blood and as supportive treatment in certain diseases and blood disorders. When whole blood is not needed, or when it is not available, plasma, the fluid of the blood without the blood cells, can be given. Alternately, such components of the blood as red cells, white cells, or platelets may be given for particular deficiencies. Blood substitutes, which are under development, are expected ultimately to ease the chronic short supply of blood and to alleviate certain storage and compatibility problems.
In whole-blood transfusions, the blood of the donor must be compatible with that of the recipient. Blood is incompatible when certain factors in red blood cells and plasma differ in donor and recipient; when that occurs, agglutinins (i.e., antibodies) in the recipient's blood will clump with the red blood cells of the donor's blood. The most frequent blood transfusion reactions are caused by substances of the ABO blood group system and the Rh factor system. In the ABO system, group AB individuals are known as universal recipients, because they can accept A, B, AB, or O donor blood. Persons with O blood are sometimes called universal donors, since their red cells are unlikely to be agglutinated by the blood of any other group. In the Rh factor system, agglutinins are not produced spontaneously in an individual but only in response to previous exposure to Rh antigens, as in some earlier transfusion. Transfusion reactions involving incompatibility eventually cause hemolysis, or disruption of donor cells. The resulting liberation of hemoglobin into the circulatory system, causing jaundice and kidney damage, can be lethal.
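The ABO and Rh compatibility rules described above can be sketched in code. This is a deliberately simplified illustration, assuming only the two group systems mentioned here; it ignores the many other blood group systems, prior sensitization history, and the plasma-compatibility rules that real crossmatching must also consider. The function and variable names are illustrative, not from any medical standard or software library.

```python
# Simplified red-cell compatibility check for the ABO and Rh systems.
# A donor's red cells are compatible when they carry no antigen that
# the recipient's plasma holds agglutinins (antibodies) against.

ABO_ANTIGENS = {
    "O": set(),         # no A or B antigens (hence "universal donor" cells)
    "A": {"A"},
    "B": {"B"},
    "AB": {"A", "B"},   # both antigens, but no anti-A/anti-B antibodies
}

def red_cells_compatible(donor: str, recipient: str) -> bool:
    """Return True if donor red cells should not be agglutinated by the
    recipient's plasma, under this simplified ABO + Rh model.
    Blood types are written like 'O-', 'A+', 'AB+'.
    """
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    # ABO rule: every antigen on the donor's cells must already be present
    # on the recipient's own cells; otherwise the recipient carries the
    # corresponding agglutinin and the donor cells will clump.
    if not ABO_ANTIGENS[d_abo] <= ABO_ANTIGENS[r_abo]:
        return False
    # Rh rule: Rh-negative recipients should not receive Rh-positive cells,
    # since exposure to the Rh antigen can provoke antibody formation.
    if r_rh == "-" and d_rh == "+":
        return False
    return True

# 'O-' cells are accepted by every type; 'AB+' accepts cells of every type.
all_types = ["O-", "O+", "A-", "A+", "B-", "B+", "AB-", "AB+"]
print(all(red_cells_compatible("O-", r) for r in all_types))   # universal donor
print(all(red_cells_compatible(d, "AB+") for d in all_types))  # universal recipient
```

The subset test (`<=`) captures the text's point directly: group O cells, bearing neither antigen, are unlikely to be agglutinated by anyone's plasma, while group AB recipients, lacking both agglutinins, can accept any ABO type.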
In addition to providing for the compatibility of blood groups in transfusion, it is necessary to determine that the donor's blood is free of organisms that might cause syphilis, malaria, serum hepatitis, or AIDS (via HIV, the virus that causes it). Allergic reactions to transfusions may occur in cases where allergic antibodies have been transmitted from the donor's blood, possibly because of some type of food recently ingested by the donor. These problems have increased the popularity of autologous transfusions, transfusions using a person's own blood, which has been donated ahead of time. See blood bank.