Hektoen International

A Journal of Medical Humanities

Physician: study thyself

Susan Hurley
Victoria, Australia

Jesse William Lazear, an American physician who died in 1900 after a self-experiment.

In 2016 one man died and five others suffered brain damage during a drug trial in Rennes, France.1 A similar disaster occurred during the 2006 London trial of a novel monoclonal antibody: six men experienced an immediate systemic inflammatory response and became critically ill with multi-organ failure.2 These tragedies are a poignant reminder that the safety of a drug cannot be reliably predicted from animal and laboratory experiments. First-in-human (phase one) clinical trials are therefore inherently perilous. Yet the participants in the Rennes and London studies were not afflicted by the conditions the drugs were designed to treat. They were healthy men. On completion of the Rennes trial the volunteers would have received €1,900 each;3 the London study remuneration was £2,000 per person.4 These men were “guinea-pigging”, an activity that underpins drug development today.

It was not always thus. In the nineteenth century, and well into the twentieth, scientists and doctors who needed a warm body on which to test their theory or new medicine often offered up themselves. Self-experimentation has been a crucial step in developing our understanding of the causes and natural history of many diseases. Lawrence Altman documents its fascinating history in his book Who Goes First?5 He describes how self-experimentation has been responsible for many therapeutic advances and for the existence of whole areas of medicine such as anesthesia. At least nine Nobel Prizes have been awarded to physicians and scientists for work underpinned by their self-experiments.6,7 The three most recent went to Werner Forssmann in 1956, who threaded a ureteral catheter into his right atrium, thereby pioneering cardiac catheterization; the Australian gastroenterologist Barry Marshall in 2005, who swallowed a broth of the bacterium now known as Helicobacter pylori to prove that it causes gastritis and ultimately stomach ulcers; and Ralph Steinman in 2011, who, after being diagnosed with pancreatic cancer, was administered three vaccines based on his dendritic cell research.

Many self-experimenters were spectacularly heroic.5 Take Stubbins Ffirth, a University of Pennsylvania medical student. In the early 1800s he set out to disprove the then conventional wisdom that yellow fever was a communicable disease. Nursing care was the only treatment for yellow fever, but patients were often abandoned to die by families who feared becoming infected. Ffirth slept on a bed where a patient lay ill with yellow fever. He swallowed yellow-fever patients’ black vomit, inserted it into a cut he made in his forearm, and dropped it in his eye. He even heated a vomit sample and inhaled the resulting gas. Ffirth then broadened his experiments, swallowing saliva and blood from patients and injecting himself with their blood. He was right that yellow fever is not contagious, but it is transmissible by blood; he presumably survived because the patient whose blood he injected had already cleared the virus.

Pierre-Fleurus Touéry was another intrepid nineteenth-century self-experimenter. A chemist who had discovered that charcoal is an antidote to certain poisons, he is reported to have swallowed ten times the fatal dose of strychnine, mixed with charcoal, in front of an assembly of the French Academy of Medicine.

Self-experiments like Touéry’s and Ffirth’s were valiant responses to colleagues’ skepticism. The physicians’ principle “first do no harm” has motivated others, in particular vaccine researchers, who were once among the most committed self-experimenters. There was even a club that embodied the tradition: the Pasteurian Club. There were no meetings or medals, just a common bond: all club members had tried on themselves the vaccine they were developing.

Waldemar Haffkine, for example, a member of Pasteur’s team, tested his primitive cholera vaccine on himself in 1892. Almroth Wright also tested his typhoid vaccine on himself, as did all the major polio vaccine researchers: Brodie, Park, Kolmer, Koprowski, and Salk and Sabin, the two who were ultimately successful. “You wouldn’t do unto others that which you wouldn’t do unto yourself,” Jonas Salk said, describing self-experimentation as “ritual and symbolic.”5,8 It was also risky. Hilary Koprowski had treated his live polio vaccine to weaken the virus. But had he weakened it enough? Koprowski bravely swallowed a dose. He said it tasted like cod liver oil.

Sometimes, though, Koprowski drew a line.5 In the 1940s he tested a Colorado tick fever vaccine on himself, and several months later exposed himself to a challenge dose of the virus. Fortunately, that vaccine was effective. But almost thirty years later, when developing a rabies vaccine, Koprowski decided it was imprudent to expose himself to the usually fatal rabies virus. Likewise Dr. Daniel Zagury, who injected himself with his own experimental HIV vaccine. He found that it stimulated antibodies, but when asked if he would now inject himself with HIV to test the vaccine’s effectiveness, Zagury responded: “Do you think I’m crazy?”5,8

Neither Koprowski nor the other polio researchers contracted polio.5 Nor did Haffkine get cholera or Almroth Wright typhoid. Later, though, when Wright was trying to develop a vaccine against brucellosis, a colleague injected him with Brucella bacteria and he suffered a painful case of brucellosis. His vaccine was ineffective.

In Who Goes First? Altman describes numerous other casualties of self-experimentation. David Clyde, who was trying to develop a malaria vaccine, allowed himself to be bitten by malaria-infected mosquitoes and contracted malaria. The surgeon William Halsted, searching for a local anesthetic to replace general anesthesia during minor surgery, became addicted to cocaine. And then there is self-experimenter extraordinaire Dr. Thomas Brittingham, whose workplace in Missouri was known, with good reason, as the Kamikaze School of Medicine. Brittingham and colleagues injected themselves with blood from patients afflicted by blood disorders. Brittingham even received blood transfusions from a woman with chronic myelogenous leukemia. Despite suffering a severe reaction, he continued self-experimenting. After being injected with blood from a patient with aplastic anemia he went into shock and was diagnosed with hepatitis.

There have been self-experimenter fatalities too. Dr. Jesse Lazear, a member of a commission trying to determine whether mosquitoes transmit yellow fever, acted on a pledge: “. . . that they [commission members] should themselves be bitten [by mosquitoes], and subject themselves to the same risk that necessity compelled them to impose on others.” Infected mosquitoes do transmit yellow fever. Lazear died finding that out.

Self-experimentation has not been confined to universities and hospitals. In the 1940s, pharmacologists at a Danish pharmaceutical company, Medicinalco, routinely tested new drugs on themselves. They were known as the “Death Battalion.” Medicinalco had an incentive scheme for them. Anyone who gave a blood sample got a glass of port, and at the annual company dinner dance the scientist who had been the most prolific self-experimenter got a prize—a plastic skeleton named “Jacob.” Self-experimentation at Medicinalco led to the serendipitous discovery that disulfiram causes intolerance to alcohol and can therefore be used to treat alcohol dependency. Erik Jacobsen, the medical research director, noticed that he became unwell after his lunchtime beer, and Jens Hald, one of his researchers, had a similar reaction after an evening cognac. Both had taken disulfiram, which they were testing as a treatment for intestinal worms.

Three-quarters of a century later, self-experimentation with drug candidates is no longer the norm, if it occurs at all. Pharmaceutical development is now a regulated, administratively complex process with high financial stakes. The scientific detachment needed for unbiased assessment of outcomes could be compromised if a drug’s inventor were also a trial participant. In fact, though, the inventor may have already sold or licensed their intellectual property and may be only peripherally involved in their drug’s first clinical trial. Today, those who have the most to gain financially from a drug’s successful development are likely to be pharmaceutical company investors and employees. First-in-human trials, like those in Rennes and London, are typically run by multinational businesses known as contract research organizations, or CROs. These CROs recruit the human guinea pigs who “go first.”

This commercialization of a once noble act has not been without controversy. Trial participants are often students and the unemployed or underemployed, prompting concerns about the ethics of experimenting on people whose acute need for the proffered payment could cloud their assessment of the risks.9 Also of concern is the existence of for-profit companies whose business is to review the ethics of trials: if one company rejects a study as unethical, the sponsor can potentially seek approval from another.10 In 2006, a 675-bed drug-testing site, the largest in North America, was demolished after reports that its owner was “paying undocumented immigrants to participate in drug trials under ethically dubious conditions.”9

Who then should go first? It is perhaps worth noting that Pasteur did not experiment on himself, apparently much to his chagrin. Pasteur was a French hero. His colleagues insisted that they, rather than their boss, be given trial doses of his rabies vaccine. And although Walter Reed is credited with being the mastermind of the yellow fever experiments that led to Jesse Lazear’s death, and was a party to the self-experimentation pact, he did not expose himself to mosquitoes. There is no evidence that Reed coerced Lazear to go first, or Pasteur his colleagues.5 Nor is there evidence that the 24-year-old technician at Johns Hopkins, who in 2001 suffered a fatal reaction to hexamethonium during a study in the center where she worked, was pressured to volunteer.11 Yet the inevitable disparity in power between young scientists and their supervisors amplifies the concerns such a tragedy evokes. Who should go first is now a difficult question to answer, but it should be asked.

References

  1. Eddleston M, Cohen AF, Webb DJ. Implications of the BIA-102474-101 study for review of first-into-human clinical trials. Br J Clin Pharmacol. 2016;81(4):582-586.
  2. Suntharalingam G, Perry MR, Ward S, et al. Cytokine storm in a phase 1 trial of the anti-CD28 monoclonal antibody TGN1412. N Engl J Med. 2006;355(10):1018-1028.
  3. Enserink M. What we know so far about the clinical trial disaster in France. Science. Jan 15, 2016. http://www.sciencemag.org/news/2016/01/what-we-know-so-far-about-clinical-trial-disaster-france
  4. Rosenthal E. When drug trials go horribly wrong. The New York Times. April 7, 2006. http://www.nytimes.com/2006/04/07/world/europe/when-drug-trials-go-horribly-wrong.html
  5. Altman LK. Who Goes First? The Story of Self-Experimentation in Medicine. New York: Random House; 1987.
  6. Gravitz L. A fight for life that united a field. Nature. 2011;478(7368):163-164.
  7. Weisse AB. Self-experimentation and its role in medical research. Tex Heart Inst J. 2012;39(1):51-54.
  8. Smith JS. Patenting the Sun: Polio and the Salk Vaccine. New York: William Morrow and Company, Inc; 1990.
  9. Elliott C. Guinea-pigging. The New Yorker. Jan 7, 2008. http://www.newyorker.com/magazine/2008/01/07/guinea-pigging
  10. Elliott C. The best-selling, billion-dollar pills tested on homeless people. Matter. Jul 28, 2014. https://medium.com/matter/did-big-pharma-test-your-meds-on-homeless-people-a6d8d3fc7dfe
  11. Ogilvie RI. The death of a volunteer research subject: lessons to be learned. CMAJ. 2001;165(10):1335-1337.

SUSAN HURLEY, MPharm, MS (Biostatistics), PhD, is a health economist based in Melbourne, Australia. Her research on the effectiveness and cost-effectiveness of public health programs and pharmaceuticals has been published in journals including The Lancet, the Journal of the National Cancer Institute, and Epidemiologic Reviews. She is also a writer whose work has appeared in The Australian newspaper and in magazines such as Kill Your Darlings, Great Walks, and The Big Issue.

Winter 2017
