Human Experimentation: Who are the Guinea Pigs?

by David Ozonoff

‘Science for the People’ Vol. 9, No. 2, March/April 1977, pp. 18–21

David Ozonoff teaches humanities and the history of the public health movement. He was active in the Medical Committee for Human Rights. 

Nobody, least of all medicine’s liberals, favors putting patients at risk by subjecting them to wanton experimentation, whatever the potential benefits. When left at the level of abstraction, there is little controversy in this proposition, and it is easy to be on the side of the angels. But when concrete instances are considered, difficulties at once become apparent. These difficulties are most important when they arise not in the context of outlandish examples of exploitation of human subjects but rather in the context of respected academic medicine.

Consider, for example, an editorial in the prestigious New England Journal of Medicine which sets out the dilemma of the academic clinician and attempts to formulate a solution. Its author, Dr. Franz J. Ingelfinger, reaffirmed in this piece what everyone familiar with the academic medical scene has known for some time:1 “Even journals of the greatest probity contain accounts of experiments in which children are exposed, without potential direct benefit to themselves, to sensitivity reactions, to urinary-bladder puncture, and to radioactive substances.” “In each of these instances the risk to the subject is extremely small,” he added. Yet even these “extremely small” risks would be impermissible under the International Code of Ethics of the World Medical Association, which states clearly that “under no circumstances is a doctor permitted to do anything that would weaken the physical or mental resistance of a human being except from strictly therapeutic or prophylactic indications imposed in the interest of his patient.”2 It was therefore Dr. Ingelfinger’s opinion that “extreme positions” like those put forward by the Code were neither in line with what is practiced nor with what is practicable.

If such a Code is too strict, then what constitutes practicable yet proper ethical guidance in this matter? Ingelfinger suggests that experimentation is permissible when the risks involved are “small and justifiable.” There are clearly grave difficulties here in deciding the size of risk and the extent of justification, as well as in specifying who should make these judgments and determinations. Without denying the crucial importance of these questions, I would like to examine another aspect of this problem by considering its underlying assumptions. Dr. Ingelfinger in the same editorial obligingly tells us what these assumptions are: “Society does have some rights vis-à-vis the individual, not only in matters pertaining to war and immunization, but also in searching for improved methods to control disease.”3 This is, I believe, the essence of a liberal solution.

By placing experimentation for the good of society in the same context as the right to conscript for war, or the right to require immunization, Dr. Ingelfinger identifies the issue as one involving the balance between the rights of the individual and the rights of the collectivity. This is not a novel or controversial way of putting it. On the other hand, there is something profoundly disturbing when the central question is left in this form. We have on the one hand individuals with certain rights of varying magnitude or significance and, on the other, society, with different rights. The formulation, as given, implies that all individuals are equally likely to pay the costs, while the whole society will receive the benefit. But in saying that society claims its rights over individuals by drafting people for war, many things are left unsaid. Wars are most frequently fought over issues or for reasons that do not threaten or benefit all segments of a society equally. Not all members of the society are equally likely to be drafted. There is a systematic bias in “choosing” those who must fight and die. What I am saying is that a cost-benefit formulation of a social issue is incomplete unless one remembers to ask, “Cost to whom, and benefit to whom?”

Consider now society’s “rights vis-à-vis the individual … pertaining … to immunization.” Here the difference between the ideal and the actual situation seems insignificant. Compulsory immunization and public health policy in general are palpably “for the common good” and expect a “common obligation.”

Closer examination reveals a more complex picture. Since the rise of the public health movement in the mid-nineteenth century, a number of recurring themes have accompanied much or all of public policy relating to health care.4 “Sanitary reform,” as this movement was called, was initially obsessed with the social tensions and disorders accompanying urbanization and industrialization. Reformers believed it was their task to bring behavior concerning personal hygiene and temperate living into line with universally valid laws of Nature. It was no coincidence that these supposedly universal laws were also those which were vital to the economic interests of the entrepreneurial classes. Moreover, those “laws” particularly emphasized self-restraint and moderation, two elements of character especially significant in a world where social strife was greatly feared by those who stood to lose from it. The message of sanitary reform was consistent and explicit: disharmony in the social order went hand in hand with disharmony in bodily processes, accounting for the high incidence of disease and death so obvious among the lower classes.5 It is often said that the sanitary movement’s triumph consisted in recognizing the influence of environment and living conditions on health. Although this is certainly true, it must be noted that the reformers still put the blame squarely on the individual.

This ideology of sanitary reform implied to rich and poor alike that the poor sections of town were logically the centers of moral corruption, vice, and disease. But epidemics that started there could erupt to menace the entire community. The slums were therefore the special targets of campaigns to flush the streets of refuse (usually with municipally supplied water) and of intensive campaigns to disinfect cholera nests with chloride of lime or something similar. Since cholera is a waterborne disease, these measures were entirely ineffectual. Yet they persisted, because these and similar actions were meant not only as prudent attempts to protect the worthy of the city, but also as an object lesson for the poor, whose depravity required constant emphasis along with eternal vigilance. With the advent of effective immunization for many communicable diseases and the disappearance of other diseases through the introduction of pure piped water, public health practice underwent a transformation that by 1920 saw it almost completely subordinated to medical practice and medical practitioners. Preventive medical care became the responsibility of our business-oriented system of health care delivery, resulting in a predictable distribution of immune protection among our nation’s children. The Center for Disease Control, for example, estimates that today 37 percent of all school children in this country have not been immunized against measles, polio, diphtheria, pertussis and tetanus, and that the distribution of these unimmunized children is heavily skewed towards the poor.6 Even where immunization is compulsory, as it has been for measles vaccination of all school-age children in New York State since 1968, distribution of measles protection follows the same patterns as the distribution of nearly all similar goods and services in our society. A New York survey in 1970-71 showed that only 74 percent of inner-city children in the five largest upstate cities were immunized, as opposed to 91 percent of the children from more affluent areas.7

Neither in analyzing the draft nor in attempting to understand the more general case of public health practice does the “individual versus society” formula come to grips with certain important social facts of life. Substantial departures from the ideal of equally shared costs and benefits exist in the systematic shift of benefits away from the poor toward the social classes to which most doctors, lawyers and researchers belong. At the same time that these privileged classes are denying the benefits of public health to the lower classes, they are shifting most of the costs and risks involved onto those same lower classes.

The rich and poor today, as in the nineteenth century, find themselves living within a network of ideological, social and productive relations from which no one can completely escape. This is particularly true of those who work within the medical care system, because it intertwines with so many of our social and political institutions, and reflects so many of our political and social givens. This being the case, clinical research and experimentation with human subjects reflect the same trends evident in forced conscription and in the structure and ideology of preventive medical services.

In a vigorous defense of clinical research given in 1969,8 Dr. Francis D. Moore noted that in ethically conducted research it is crucial that “those selected for therapeutic innovation represent the full spectrum of the hospital population and not just a group for whom recourse would be scanty.”9 He goes on: “At the present time we are engaged in one of the largest human experiments … ever considered: the widespread use of oral contraceptives. It has been estimated that more than 25 million women have taken these tablets and that at any one time 15 million women are taking them.” But because oral contraceptives were given to normal individuals to prevent a normal occurrence, he argued, it was especially important that the evaluation of oral contraception “be even more free of taint than innovations involved with the treatment of disease.”10

“The pill” is neither the perfect contraceptive, nor is it 100 percent effective. High motivation and good understanding of a complex regimen are key factors in the pill’s efficacy. As Dr. Hugh Davis of the Johns Hopkins University School of Medicine has remarked, “It is the suburban middle-class woman who has become the chronic user of the oral contraceptive in the U.S. in the last decade, getting her prescription renewed month after month and year after year without missing a single tablet.”11 Effectiveness, acceptance, and proper use all fall off as researchers and clinicians try to study or prescribe the “pill” across cultural, socio-economic, or language barriers. All this seems predictable and obvious. Yet the first field trials were done on poor Puerto Rican women in San Juan and Humacao.12 The San Juan study involved women in a low-income housing project in a slum clearance area. The researchers’ first act was to get on the “good” side of the superintendent of the project, a male, who had great enthusiasm for their work and extended more than full cooperation. In the Humacao study the data were analyzed in Boston at the Harvard School of Public Health, and cervical biopsies, used to gauge a drug’s carcinogenic (cancer-causing) potential, were sent to the Free Hospital for Women in Brookline, Mass. Why, then, didn’t the patients also come from this area? 

Although the pill was judged in these studies to be highly effective and of low hazard, one must ask why poor, city-dwelling, Spanish-speaking territorial subjects were selected for the first two extensive field trials of a drug regimen most easily and most appropriately studied on middle-class English-speaking American suburbanites. The answer rests not in evil intent, but more in “That’s how things get done.” The whole complex system of social checks and balances, which is supposed to ensure equal opportunity for both benefit and liability, in fact conspires to ensure a systematic departure from that ideal.

Although those studies were done in the late fifties, the same thing, of course, occurs today. A 1970 study sponsored by Syntex Labs and the U.S. Agency for International Development is a striking case in point.13 In an effort to discover whether the many minor but annoying side effects of the pill were real or imagined, a double-blind randomized study using active oral contraceptives and placebos (sugar pills) was done on 398 women. The women, of course, were not told about the placebos, but instead were instructed to use a vaginal foam “until we’re sure your pill is effective.” Eleven pregnancies resulted in the unprotected group, possibly because of lax precautions with the foam, possibly because foam just isn’t very effective. This is clearly a study in which proper understanding and the ability to communicate subjective symptoms and complaints are most important in achieving optimum results. Yet who were these subjects? Most of them were poor Mexican-American mothers who had come to the Planned Parenthood Clinic in San Antonio seeking contraceptive assistance.

If in fact systematic class bias does exist in the realm of experimentation with human subjects, doesn’t this constitute a major flaw in the present ethics of clinical research? I believe that substantial bias does exist. The question seems to have been largely ignored in the discussion of human experimentation to date. Clearly, what is needed is a thorough examination of the class nature of human-subject research, both past and future.

This, too, is research with human subjects, in a sense. But it is research that recognizes that even the sterile operating field of the clinic exists within a social context that has set certain preconditions before the experiment even begins.

REFERENCES

  1. Ingelfinger, F.J., “Ethics of Experimentation on Children,” New England Journal of Medicine, Vol. 288, No. 15 (April 12, 1973), pp. 791-792.
  2. Ibid., p. 791.
  3. Ibid., p. 792.
  4. For an illuminating discussion of 19th century public health see: Rosenkrantz, Barbara G., Public Health and the State, Harvard University Press, Cambridge, Mass., 1972, Chs. 1-2.
  5. For example, in 1850 Lemuel Shattuck wrote in his famous Report of the Sanitary Commission that good health was within reach of all who wished to have it, had they just the strength of character to live rightly. In reviewing the Shattuck Report, the prestigious North American Review noted: “The sanitary movement does not merely relate to the lives and health of the community; it is also a means of moral reform … Outward impurity goes hand in hand with inward pollution, and the removal of one leads to the extirpation of the other.” Ibid., p. 34.
  6. Hinman, A.R., “Resurgence of Measles in New York,” American Journal of Public Health, Vol. 69 (1972), p. 498.
  7. Ibid.
  8. Moore, Francis D., “Therapeutic Innovation: Ethical Boundaries in the Initial Clinical Trials of New Drugs and Surgical Procedures,” in Experimentation with Human Subjects, Paul A. Freund (ed.), The Daedalus Library, George Braziller, N.Y., 1970, pp. 358-378.
  9. Ibid., p. 376.
  10. Ibid., p. 363.
  11. Testimony of Dr. Hugh J. Davis, Assistant Professor of Obstetrics and Gynecology, The Johns Hopkins University School of Medicine, January 14, 1970, Nelson Committee extract reprinted in Katz, Jay, Experimentation with Human Beings, Russell Sage Foundation, N.Y., 1972, pp. 770-771.
  12. Ibid., pp. 739-744.
  13. Ibid., pp. 791-792.