Hektoen International

A Journal of Medical Humanities

Medical and scientific innovations arising from warfare

Brian Omondi
Nairobi, Kenya

 

Perhaps the only bright side of war is that it impels nations to make medical and scientific innovations. War has long been portrayed as the best school for surgeons, and even for doctors.1 The association between medicine and the military can be traced back to ancient Greece, and the link has grown stronger ever since, so that most wars have led to breakthroughs in medicine. The two World Wars of the last century, in particular, produced significant medical advances and technological discoveries.

The United States military has long been associated with the development of licensed vaccines. Early on, military scientists developed hepatitis A, hepatitis B, and rubella vaccines.2 In 2011 the military adopted the adenovirus type 4 and type 7 vaccines to stop severe respiratory disease, after it was found that the virus thrived in barracks and cost soldiers training time. The vaccine was evaluated and tested at the behest of the Navy and Army basic training commands, leading to its approval by the U.S. Food and Drug Administration (FDA).

Many medical and scientific innovations during World War I helped prevent the spread of contagious diseases. Mobile laboratories were set up, and typhoid vaccines and tetanus antitoxin were developed, substantially reducing the spread of disease.3 Doctors were reminded to stay vigilant for outbreaks such as the 1918 Spanish flu. There were developments in neurosurgery, psychiatry, and orthopedics, some later of great use in treating the civilian population. The relation between mosquito bites and malaria was also examined in depth during World War I. Later, in World War II, Sir Neil Hamilton Fairley used Australian volunteer soldiers to study malaria further, paving the way for Garnham and Shortt’s work in 1948. Fairley showed that one tablet a day of mepacrine, the German-developed drug also known as atabrine, kept malaria at bay.

World War II contributed to the growth of blood transfusion, which was primitive at the onset of the war but later became sophisticated.4 Blood was stored and distributed where it was needed, and the British Army routinely used transfusion to treat its wounded soldiers. At first blood was transfused directly from one person to another, but during World War I Captain Oswald Robertson, a U.S. Army doctor, realized that storing blood in advance was an important step toward treating wounded soldiers faster.5 In 1917 he set up the first blood bank, preventing coagulation with sodium citrate, keeping the blood on ice for twenty days, and then transporting it to sick bays and other stations to be used in surgery. Even though the process was primitive because of difficulties with blood typing, it tripled the survival rates of wounded soldiers. Blood transfusion improved further during World War II, and thanks to these advances it continues to be used in hospitals throughout the world.

World War II saw the large-scale use of penicillin. Though discovered earlier, it was produced during that war in huge quantities.6 The primary motive was to treat the thousands of soldiers suffering from syphilis and gonorrhea, a plague on the military for decades. Other antibiotics followed, as did treatments for the tropical diseases that had deterred American soldiers from fighting in tropical climates, and pesticides to kill mosquitoes and prevent malaria. Aviation medicine developed during World War II also had a substantial impact, enabling people to fly safely at high altitudes for longer periods of time. Developments such as supplemental oxygen, night vision studies, safety belts, and crash helmets all emerged from aviation medicine.

The medical and scientific innovations that arose from World Wars I and II played a major role in improving medical services. Antibiotics improved wound care, as did the Carrel-Dakin method of applying sodium hypochlorite to wounded tissues, a technique still used in some hospitals today.7

The injuries of the First World War were both physical and mental.8 Many soldiers became used to living in muddy trenches filled with exploding shells, rotting corpses, and rats, but some did not and developed shell shock, a mental illness that left them paralyzed, hysterical, disoriented, and unable to obey orders. Some medical practitioners saw this as a physical condition caused by the concussive force of shell blasts on brain tissue; others saw it as an acute form of psychological distress. The controversy led to extensive research by the psychoanalytic movements of the early twentieth century, work now considered a precursor of cognitive behavioral therapy. The modern military is accordingly far more knowledgeable about post-traumatic stress disorder and psychological trauma.

WWII also expanded the science of nutrition, identifying which minerals and vitamins are essential for the human body and determining the number of calories burned during different activities.9 Proper food handling, storage, and preparation became a top military priority. Soldiers’ rations were carefully formulated to offer the maximum amount of energy and nutrition while providing taste and variety, challenges met through a combination of laboratory and kitchen experimentation. The Army developed the “D ration,” a fortified chocolate bar that was high in calories and very useful in emergencies.10 Three of the bars provided a soldier with more than 1,500 calories of energy. The “D ration” was also made to withstand different temperatures, weigh only four ounces, and still taste good. By the end of WWII, billions of these rations had been produced for the United States military and other armies around the world.

Since the days of ancient Greece military medicine has changed greatly. The doctors of World Wars I and II came to understand the link between exposure to germs on dirty battlefields and infection. Blood transfusion, penicillin, and vaccines were developed, and there were breakthroughs in nutrition, neurosurgery, psychiatry, and orthopedics, all attributable to war.11

 

References

  1. Sebastian Junger. 2010. War. New York: Twelve.
  2. Mark Harrison. 2005. “Medicine and the management of modern warfare: an introduction.” Clio Medica/The Wellcome Series in the History of Medicine 55, no. 1: 1-27.
  3. Philip Aspden. 2006. Medical innovation in the changing healthcare marketplace: conference summary. Washington, DC: National Academy Press.
  4. Jerold Brown. 2001. Historical dictionary of the US Army. Westport, Conn.: Greenwood Press.
  5. Ashley Ekins and Elizabeth Stewart. 2011. War wounds: medicine and the trauma of conflict. Wollombi: Exisle Publishing.
  6. Brown, Historical dictionary of the US Army, 55.
  7. Paul Auerbach. 2012. Wilderness medicine. Philadelphia, PA: Elsevier/Mosby.
  8. Sujata Sarabahi, Tiwari, and Bajaj. 2012. Principles and practice of wound care. New Delhi: Jaypee Brothers Medical.
  9. William Young and Nancy Young. 2010. World War II and the postwar years in America: a historical and cultural encyclopedia. Santa Barbara, Calif.: ABC-CLIO.
  10. Michelle G. Craske. 2010. Cognitive-behavioral therapy. Washington, DC: American Psychological Association.
  11. Robert Leslie. 2011. The medical response to the trench diseases in World War One. Newcastle upon Tyne: Cambridge Scholars Publishing.

 


 

BRIAN SAMMY OMONDI was born in Mombasa, Kenya, and raised in Nairobi. He is currently a fourth-year student pursuing a Bachelor of Science in Tourism Management at the University of Eldoret.

 

Spring 2016
