It took several centuries for the medical community to arrive at modern-day transfusion medicine. This WellnessKeen post outlines the history of blood transfusion, from the early practices of bathing in and drinking blood, through the first transfusion attempts, to the major scientific developments that followed.
Did You Know?
Physicians in France speculated that blood carried the characteristics of the donor, and that transfusion could bring about a change in the species of the recipient animal or individual.
The process of introducing whole blood or blood components through the intravenous route is known as blood transfusion. The transfused blood may be the patient's own blood salvaged earlier (autologous transfusion), or it may be obtained from someone else (allogeneic transfusion).
Records indicating that blood was recognized as vital to life appear in Greek mythology, as well as in the observations of ancient Greek scholars such as Aristotle. Studies of blood, human anatomy, and disease laid the foundation for blood transfusion, and the massive blood loss of the two World Wars spurred further developments in the field.
The field of transfusion medicine was born of experiments with animal blood transfusions and the early efforts to transfuse human blood. Given below is an account of the history of this 'red gold' that flows through our bodies, intertwined with the history of blood transfusion.
Up to the 16th Century
Since ancient times, blood was considered essential for life, but how and why were not clearly understood. The Egyptians believed that bathing in animal blood provided rejuvenation, and some Egyptian physicians believed that bloodletting could cure disease. Roman gladiators were said to drink the blood of fallen warriors in order to gain courage.
Around 500 BC, animal dissections gained ground and the foundations of the science of anatomy were laid. The Greek physician Hippocrates, and later his follower Claudius Galen, promoted the idea that the body was composed of four fluids, or humors (blood, phlegm, black bile, and yellow bile), and that an imbalance caused by the accumulation of one of these humors resulted in disease. This led to the belief that bleeding a person to remove the excess humor could both prevent and cure disease. Between 130 and 200 AD, Galen observed the difference between arteries and veins and suggested that blood was formed in the liver and transported by the veins.
With animal and human dissections now common, organs such as the heart and lungs were studied in detail with respect to their structure and function. Sometime between 1200 and 1500 AD, the flow of blood to and from the lungs (pulmonary circulation) was described, along with further details of the flow of blood within the heart.
Amidst this scientific progress and the growing importance attached to blood in health and disease, the first transfusion myth was born. According to one story, in 1492 a Jewish physician tried to save Pope Innocent VIII by suggesting that he drink the blood of three young men. There is no evidence that the Pope accepted the treatment; he died soon after, as did the three youths who gave their blood.
In the mid-1500s, the Flemish anatomist Andreas Vesalius, who studied and worked in Italy, explored human anatomy and published detailed drawings of his dissections. With these emerging anatomical studies, the theory of humors was gradually buried in history.
17th Century
The 17th century saw a series of blood transfusion attempts, and the concept of blood transfusion is said to have originated with a German physician named Andreas Libavius. In 1614, he suggested that transfusing blood from the arteries of a young person into those of an old man could restore the latter's health and energy. However, he never actually attempted such a procedure.
The Circulatory System
One of the most significant events of this century was the discovery of the circulation of blood by William Harvey in 1615 (published in 1628). He was the first to describe how blood is pumped throughout the body by the heart, and to show that it is not used up or consumed but circulates continuously.
Animal Blood Transfusions
The first successful animal-to-animal blood transfusion was performed in England in 1665 by Richard Lower, who used a quill to connect the artery of a donor dog to a vein of the recipient dog.
In 1667 in France, a physician named Jean-Baptiste Denis attempted an animal-to-human blood transfusion to cure a boy suffering from fever. Surprisingly, the boy survived, although he showed symptoms of hemolytic anemia, a condition unknown at that time. Denis repeated the experiment successfully on two more individuals, but his attempt failed on the fourth patient, Antoine Mauroy, who was suffering from a mental illness. Denis transfused Mauroy with calf's blood three times; Mauroy survived the first two transfusions but died after the third. His death provoked great controversy, followed by a murder trial against Denis. The investigation, however, revealed the cause of death to be arsenic poisoning.
Controversies and Roadblocks
The consequence of this controversy was the prohibition of blood transfusion by the French Parliament in 1670. At the same time, beliefs about the soul and its connection to blood led to opposition to the practice in England. The British Parliament and the Vatican also imposed bans on blood transfusion.
The next 150 years saw a long pause in the science of transfusion medicine, and only a few isolated attempts at blood transfusion were recorded in the 18th century.
“It is probably fortunate that blood transfusion took a nap for over one and a half centuries. Ignorance of antisepsis, asepsis, and immunology-all 19th century discoveries-would have resulted in countless disasters.”
– N.S.R. Maluf (Historian)
18th and 19th Century
Interest in blood transfusion was revived in 1796 by Erasmus Darwin (grandfather of the naturalist Charles Darwin), who suggested blood transfusion as a way to address nutritional deficiencies.
The Early Efforts to Transfuse Human Blood
The first human-to-human transfusion in the US was performed in 1795 in Philadelphia by Philip Syng Physick, an American physician. However, he did not publish his work, and the only reference to the transfusion is a footnote in one of his medical journals.
The true revival of blood transfusion, however, is credited to James Blundell, an obstetrician at Guy's Hospital and St. Thomas's Hospital in London, whose experiments between 1818 and 1828 gave new impetus to transfusion medicine. He performed successful animal-to-animal transfusions but recognized the incompatibility of inter-species (animal-to-human) transfusions. He therefore attempted human-to-human transfusion to treat the acute hemorrhage that sometimes occurred in women during childbirth, and invented a device for indirect transfusion: blood from the donor was collected in a cup-like receptacle, from which it was introduced into the recipient with the assistance of gravity.
Using this method, his patient was transfused with four ounces of her husband's blood. The woman survived, and Blundell went on to perform transfusions in more of his patients. However, not all of his attempts were successful, and blood transfusion came to be regarded only as a last resort.
Blundell's experiments were revolutionary for their era because he used transfusion to treat actual blood loss, unlike earlier practitioners who attempted to cure diseases or drive out sins through blood transfers.
Antiseptics and Blood Substitutes
A growing understanding of bacterial infection and contamination, along with the use of antiseptics, helped to reduce infections during blood transfusion. Nevertheless, reactions caused by incompatibility between the blood of different individuals blocked further progress. Between 1873 and 1880, milk was used as a substitute for blood, but it was replaced by saline in 1884 because of the adverse reactions it caused.
20th Century
The 20th century brought a series of important discoveries, making it the most interesting and progressive era in the development of transfusion medicine.
ABO Blood Types
The first of these was the discovery of the A, B, and O blood groups by Karl Landsteiner in 1901, a finding that won him the Nobel Prize in 1930. The fourth blood type, AB, was discovered in 1902 by his colleagues Alfred von Decastello and Adriano Sturli.
The next logical leap, that matching the blood types of the donor and the recipient could increase the safety of the procedure, was proposed in 1907 by Dr. Ludvig Hektoen of Chicago. Acting on this logic, the American hematologist Reuben Ottenberg became, also in 1907, the first to match blood types before proceeding to transfusion. He also showed that type O blood could be transfused into individuals of any other blood group.
In 1912, Roger Lee demonstrated the universal acceptability of the AB blood group: blood of any of the four types could be safely transfused into an individual with blood type AB.
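Taken together, these findings define the simple ABO compatibility rule still used today: a recipient can safely receive blood only if it carries no ABO antigen foreign to their own type, which is why type O is the universal donor and type AB the universal recipient. The short Python sketch below is purely illustrative; it models the ABO system alone, ignores the Rh factor discussed later in this article, and uses names of our own choosing rather than any medical standard.

# Illustrative ABO-only compatibility check (assumption: Rh and other
# antigen systems are ignored for simplicity).
DONOR_ANTIGENS = {
    "O": set(),          # type O red cells carry neither A nor B antigens
    "A": {"A"},
    "B": {"B"},
    "AB": {"A", "B"},    # type AB carries both antigens
}

def abo_compatible(donor: str, recipient: str) -> bool:
    """True if every ABO antigen on the donor's cells is also found in the
    recipient, i.e., the recipient has no antibodies against the donated blood."""
    return DONOR_ANTIGENS[donor] <= DONOR_ANTIGENS[recipient]

print(abo_compatible("O", "A"))    # True: O is the universal donor
print(abo_compatible("A", "AB"))   # True: AB is the universal recipient
print(abo_compatible("B", "A"))    # False: type A recipients have anti-B antibodies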
Direct Blood Transfers
At this time, it was known that blood had to be transfused quickly or it would clot. To enable quick, direct transfusion, the French surgeon Alexis Carrel devised a technique in 1908 of stitching the artery of the donor directly to a vein of the recipient. Although this approach proved impractical for transfusion medicine, he extended the procedure to the transplantation of tissues and organs, and was awarded the Nobel Prize in 1912.
The setting of World War I triggered the development of several techniques for transferring blood, using innovative types of tubes and syringes. Between 1916 and 1918, Canadian Army Major Lawrence Bruce Robertson devised a syringe-cannula technique that enabled quick blood transfers and thus saved the lives of many soldiers.
Anticoagulants
During the First World War, the need for blood was enormous owing to the number of casualties, but blood could not yet be stored. This spurred another set of major advances in the field of blood transfusion.
In 1914, the Belgian physician Albert Hustin discovered that adding sodium citrate to blood could prevent it from clotting, although citrate in the amounts then used caused side effects in humans. In 1915, this problem was solved by Luis Agote in Argentina and Richard Lewisohn in the US, who determined a concentration that was effective yet harmless to the recipient. Yet another anticoagulant, heparin, was discovered in 1916, and it is still used today in several blood preservation procedures.
Another advance that enabled the storage of blood was Richard Weil's demonstration in 1915 that anticoagulated blood could be refrigerated. This was followed by the development of a citrate-glucose solution by Francis P. Rous and J. R. Turner, which allowed blood to be stored for a few weeks after collection; they demonstrated that such stored blood could still be safely transfused into a patient in need.
The First Blood Depot
The ability to store blood translated into blood banking, an idea credited to Capt. Oswald H. Robertson of the US Army Medical Officer Reserve Corps. Equipped with special training in blood storage from the Rockefeller Institute, he devised a way to store blood successfully for 21 days. He advocated typing and storing blood in advance rather than collecting it at the eleventh hour, and he developed transfusion bottles for its storage. In 1918, while serving in France, he established the first depot, or bank, of anticoagulated type O blood. The idea, however, remained limited to army medical facilities and took several decades to reach civilians.
The Royal Army Medical Corps of Britain declared transfusion 'the most important medical advance of the war' and soon adopted the technique.
More Blood Types
Karl Landsteiner and his colleagues continued to study blood antigens and identified the MNS and P blood group systems in 1927. The discovery that proved to be a major breakthrough for transfusion science, however, was that of the Rhesus (Rh) blood group system in the late 1930s and early 1940s; it remains the second most clinically important blood typing system after ABO.
Blood Transfusion in World War II
Developments in blood banking were coupled with better storage containers and innovative techniques for purifying and storing blood components. The Second World War, which lasted from 1939 to 1945, transformed blood transfusion and blood banking from their immature state into a well-organized, sophisticated clinical practice. Cryopreservation techniques, the separation of blood plasma and red blood cells, the development of better containers, and the efficient storage of individual blood components were some of the major achievements of this period.
Although it was recognized that blood plasma could be used as a substitute for whole blood, plasma was in short supply during the war. The result was the 'Blood for Britain' and 'Plasma for Britain' campaigns led by Dr. Charles Drew, through which people were encouraged to donate blood. Blood collected from all parts of the nation was processed into dried plasma and shipped to Great Britain, where it was mixed with water and administered to soldiers who had suffered blood loss. Dried plasma did not require blood typing and could be administered in as little as three minutes.
Plastic Bag for Blood
The next step in the evolution of blood transfusion was the use of plastic bags for storing blood. In 1950, Carl Walter and W. P. Murphy, Jr. introduced plasticized PVC as an ideal material for blood bags. Fragile glass containers were gradually replaced by plastic bags, making the handling, storage, and transport of blood much more convenient.
The latter half of the 20th century saw the identification of blood-borne viruses such as the Human Immunodeficiency Virus (HIV) and the hepatitis viruses. This led to the development of pathogen-specific screening tests, with donated blood tested before storage, and standard criteria were established to ensure the safety of transfusion and prevent the spread of blood-borne pathogens.
The recent period is one of genomics, which offers more accurate detection and screening of pathogens, as well as the identification and development of substitutes for blood and its components. Although blood transfusion is a very safe procedure today, it is not entirely free of risk, and efforts to develop substitutes and to make the procedure completely risk-free continue.