From the time of the Roman Empire until well after the discovery of the tuberculosis bacterium in 1882, many of the best medical minds believed that "miasmas"—invisible vapors emitted from the earth—caused killer infections such as typhus, diphtheria, and malaria. Though the bacteriological revolution of the late nineteenth century routed that theory, a new miasma theory has lately sprung up in schools of public health, holding that racism and sexism, though as unmeasurable as the ancient miasmas, cause AIDS, cancer, drug addiction, and heart disease. Indeed, according to public health professors, living in America is acutely hazardous to women and minorities, so shot through is the United States with sickness-producing—even fatal—injustice and bigotry.
You might be inclined to dismiss such a claim as just one more silly but harmless emanation from the ever more loony academy. Trouble is, government health agencies such as the Centers for Disease Control and Prevention (CDC) and the National Institutes of Health (NIH) take the academic miasmaticians very seriously, funding their activities and busily investigating on their own the health effects of "patriarchy" and racism. Though such politicized inquiries divert money from needed health education and research, their most pernicious effect is on the concept of individual responsibility, once a cornerstone of public health efforts. Government and academic miasmaticians now argue that members of designated victim groups are incapable of controlling such destructive behavior as promiscuous unprotected sex and intravenous drug use. In other words, some of the very people who claim to be solving public health problems have embraced an ideology that can only make them worse.
The command center of the modern miasma movement is at the Harvard School of Public Health, with Associate Professor Nancy Krieger at the helm. Shown on her Web page sporting a tie and vest (a preemptive strike, no doubt, against the patriarchy), Krieger is a magnet for government money, which underwrites a flood of her articles on racism, sexism, and health. In 1996, major national newspapers reported Krieger's research, funded by the National Heart, Lung, and Blood Institute, claiming that racism causes hypertension in blacks. Never mind that her data showed no correlation between bias and blood pressure: working-class blacks who reported no biased treatment had the highest blood pressure, for example, while those who did report discrimination had the lowest. These results simply prove the existence of "internalized oppression," according to this master miasmatician. Blacks who say they are not discriminated against are in fact the most victimized of all, because they have been brainwashed into denying their oppression.
Behind all their talk of racism and sexism, Krieger and her colleagues' real prey is individual responsibility. Traditional epidemiology looks at both individual risk behaviors and the environment to determine the source and pattern of disease. But the modern miasmaticians assert that to study individual risk behaviors, such as drug use or smoking, "blames the victim"—at least when that victim is poor, female, or black. When such victims get sick, society is to blame. And so the public health revisionists are generating a remarkable body of excuses for the most avoidable and dangerous behaviors, particularly those relating to HIV/AIDS.
Typical is an article by Nancy Krieger and Sally Zierler, another prolific radical teaching at Brown's School of Public Health. Writing in the 1997 Annual Review of Public Health, the authors argue that "enormous public force," rather than their own bad decisions, causes minority women to get HIV. To Krieger and Zierler, HIV is like an airborne disease; if you're a woman living in a certain neighborhood, you have little choice but to become infected.
The first element of "public force" that makes women get HIV, according to Krieger and Zierler, is . . . Ronald Reagan. Reagan's military buildup and tax cuts for the rich created the conditions for the spread of HIV; the racism of "white Europeans" fanned the epidemic, for racism makes women use drugs, a high-risk behavior for HIV. "In response to daily assaults of racial prejudice and the denial of dignity," they write, "women may turn to readily available mind-altering substances for relief." Racism also causes promiscuous unprotected sex. "Seeking sanctuary from racial hatred through sexual connection as a way to enhance self-esteem . . . may offer rewards so compelling that condom use becomes less of a priority," the authors explain.
Since individual women have no control over whether they get HIV, public health officials should not seek individual behavior change. Rather, the authors demand that government assist in racial and gender "empowerment," by—among other odd new public health strategies—increasing racial pride and awareness of ancestry among blacks, monitoring the race of elected officials, and examining "sexual fulfillment" and "sexual identity" among women.
The facts on which the miasmatics base their belief system can be remarkably flimsy. Take for instance the assertion of Margaret Ensminger, a professor of health policy and management at the Johns Hopkins School of Hygiene and Public Health, that there is "not a lot of evidence to support the idea that if poor people changed their behavior, their health would get better." Her evidence to the contrary includes the Upstairs, Downstairs television series, in which the wealthy flee the plague, and the death rate on the Titanic—2 percent of first-class female passengers and 44 percent of lower-class women, she alleges. Even if such "evidence" were not partly fictional, it is irrelevant to the question of whether the behavior of today's poor helps cause their health problems.
Other academic miasmaticians try to generate more conventional proof that the American economic and social system makes people lethally sick. A study in the June Journal of the American Medical Association is typical. A group of public health and sociology professors at the University of Michigan, with funding from the National Institute on Aging, looked at whether smoking, drinking, sedentariness, and obesity explained the earlier mortality among the poorer and less well educated compared with the better-off. Having found that these four behaviors do not explain all the difference, the researchers seize gleefully on their desired results: "heightened levels of anger and hostility" resulting from economic inequality, along with the "stress of racism, classism, and other phenomena related to the social distribution of power and resources," are killing people. Hence, public health officials should focus not on individual risk behaviors to improve health but rather on the far more important health effects of "socioeconomic stratification." As in much miasmatic research, the leap from data to conclusion is premature. The Michigan researchers leave unexamined a host of relevant risk factors, including drug use, sharing unclean needles, promiscuity, violence, diet, taking medication reliably, seeking medical care when needed, and genetic predisposition to disease.
In fact, the evidence for the paramount role of individual behavior in health is overwhelming. A CDC study from 1977, before the onset of modern miasmatism, estimated that "lifestyle" plays more than twice as great a role in premature death as environment does—50 percent of premature deaths result from behavior, compared with 20 percent from genes, 20 percent from environment, and 10 percent from inadequacies in health care. More recently, former secretary of Health and Human Services Louis Sullivan estimated that improving health behaviors around just the top ten causes of death could cut premature deaths among blacks by 40 percent to 70 percent.
Even the miasmatics' own research clearly demonstrates the relevance of behavior to health. Take a study by Brown's Sally Zierler comparing the risk factors for rape with those for HIV in women, a study funded by the National Institute of Allergy and Infectious Disease. Zierler's goal is explicit: she seeks to "shift the burden of responsibility [for avoiding HIV], and thereby public health policy, from women as individuals to their broader social context."
Yet Zierler's data demonstrate a strong correlation between behavior and risk. Women with HIV were more likely to report adult rape if they had had sex before the age of 15 rather than waiting until at least age 18, if they had an average of three or more partners per year, or if they were bisexual. Clearly, the greater the promiscuity, the greater the chance of rape, a conclusion borne out by another of Zierler's data points: rape is five times more likely among adult women with HIV who reported a previous sexually transmitted disease. Sexual history, in other words, does matter on the question of rape, especially since most rapists know their victims. Alcohol and drugs matter, too. The study showed that using them increased the chance of rape, and women with HIV who first injected drugs before the age of 16 were 11 times more likely to report rape than were women with HIV who never injected drugs.
Of course, Zierler would not accept her data's inescapable message that behavioral choices count, because, to her, shooting drugs and engaging in promiscuous sex are not choices but unavoidable responses to poverty and racism. Her conclusion? Men are to blame for HIV: "Studies are needed that investigate the role of men as sexual partners," she decides, "and more generally, as people who shape the conditions and impose the experiences that increase women's exposure and susceptibility to HIV and its morbidity."
Even though the miasmatic dismissal of behavior's role in health rests on a purely ideological, rather than scientific, basis, the academic miasmaticians have easily convinced government health officials that individual responsibility is a sham. In a recent article on AIDS in the British journal Social Science and Medicine, Carolyn Beeker, a research sociologist at the CDC, argues that "virtually no behavior is under the complete and voluntary control of individuals."
At a philosophical level, Beeker may well be right. But she is not referring to the conditions for free will but to such mundane issues as "talking to a partner about condoms, avoiding anal intercourse, or leaving a sexually abusive situation." These behaviors, Beeker asserts, are not "isolated voluntary acts, but part of socially conditioned, culturally embedded, economically constrained patterns of living."
This doctrine is particularly pernicious regarding AIDS. In all but a few tragic cases, HIV is communicated by very specific individual behavior—frequent unprotected intercourse with infected partners, anal intercourse, shooting drugs with infected needles. It lies within an individual's power to avoid the disease, and public health efforts should focus on changing behavior, on the one hand, and on protecting the public through testing and partner notification, on the other—traditional public health measures that have been all but discarded for AIDS.
But Beeker and many of her colleagues at the CDC have no such tried-and-true responses in mind. The solution to AIDS, they say, lies not in behavior change but in nothing less than a revolution in male-female relations. The CDC has targeted for eradication "gender roles which define women as subordinate to men." Beeker also decries "women's dependence on their partners for sexual satisfaction." No wonder the public health profession was sad to see Joycelyn Elders, President Clinton's onanism-promoting surgeon general, booted from the job.
In this climate, it's not surprising that many government HIV "interventions" look more like women's studies consciousness-raising sessions than anything the founders of public health would recognize. The CDC, drawing on its annual $2.5 billion stream of tax dollars, funded an "intervention" for young pregnant poor women to promote their sense of "communal mindedness" and "enhance their negotiating skills." Beeker admits that it was "not clear" whether the participants' "communal competence" actually increased—nor, according to Beeker, does any evidence exist that "community empowerment" reduces risk better than a behavioral approach. Still, "empowerment" models continue to flourish.
The government has snatched up leading academic miasmaticians for research on racism and health. The CDC and the National Center for Health Statistics, for example, have given Harvard's Nancy Krieger $92,392 for an ongoing study of the social determinants of cancer for four different racial and ethnic groups. The National Institute for Child Health and Human Development is about to fund her study on how to include socioeconomic data in routine health statistics. Krieger, who often seems like the government's official voice of miasmatism, wrote up a 1994 National Institutes of Health conference called "Measuring Social Inequalities in Health" for the leading federal public health journal, Public Health Reports. Hitting all the major miasmatic themes, her article reported that the conference recommended collecting socioeconomic data as part of routine health reporting, so as to show "how the economic structure of the United States, and not simply individual behaviors or 'lifestyles,' " underlies racial and ethnic differences in health. But not just any socioeconomic data would do. "Most measures of socio-economic position have been based upon the model of the white European heterosexual nuclear family," Krieger complained. Since such measures ignore "nontraditional (such as lesbian and gay) households," they are patently inadequate.
Such contemporary public health initiatives as needle exchanges for drug users, condom distribution in schools, and the war on Big Tobacco embody the miasmatic assumption that individuals, especially from designated victim groups, have no control over their self-destructive actions. The American Public Health Association, the largest such group in the world, has lobbied Washington for federal funding of needle exchanges; President Clinton's Advisory Council on HIV/AIDS, a miasmatic hotbed, huffily condemned the administration's recent reluctant decision to stay out of the needle business. The administration's silence, the council said, was "particularly shameful" in light of the health disparities among racial groups. In other words, went the implication, blacks can't be expected to refrain from drug use and so should instead be helped to use drugs "safely"—a remarkable perversion not only of public health traditions but of much else.
Even without federal funding, needle exchange thrives in cities across the country. Along with needles, Bridgeport, Connecticut, also passes out leaflets helpfully advising drug users how to smoke crack correctly and suggesting a temporary "slowdown" if the user starts coughing up "dark stuff." Of course, officially sanctioned free needles often end up back in circulation on the streets, when users sell them for more drug money, according to Rodney Hopson, a federally funded health researcher at Duquesne University.
The public health profession's mania for showering condoms upon Americans, from schoolchildren on up, reflects the same rejection of individual self-control. A 1995 editorial in the American Journal of Public Health called "our society's failure" to place condom vending machines in convenience stores and public bathrooms and our failure to encourage "aggressive marketing of condoms" a "national tragedy." As emeritus professor Monroe Lerner of the Johns Hopkins School of Hygiene and Public Health argues, the federal public health bureaucracy now assumes that people have "no impulse control, no sense of personal responsibility." It expects young people in particular, he says, to engage in "promiscuous sexual relations." As for the schools of public health, "don't expect any plea from them to observe a more traditional morality, where young people don't go to bed with each other before they are married." Lerner, who helped push the field into more political arenas during the 1960s, a development he now regrets, acerbically summarizes the past three decades of public health thinking: "The words sin and deviance have vanished from the vocabulary."
From its inception, of course, public health has had a special concern for improving the health of the poor. But the miasmatic exemption of the poor from individual responsibility is a dangerous new twist. "There's new elements in the discourse," enthuses CDC community psychologist May Kennedy. "The commitment to social justice at the CDC is just now becoming explicit." Expressing the same view is Paul Geltman, a professor of pediatrics at Boston Medical Center and an advisor to the Massachusetts Department of Health. "[There's] absolutely been an increase in political consciousness in public health in recent years," he says.
Why? One reason is the ever larger number of sociologists, community psychologists, and anthropologists now in the field. Lacking medical or scientific training, they see public health as a vehicle for social change. "Those of us who were activists in the 1960s are now professors," Brown's Sally Zierler, herself a doctor of public health, explained in an interview. "This is a way of continuing the work." Zierler and her radical colleagues use their academic credentials to "authorize" themselves, she says: now "we are working from the inside." Zierler marvels over her group's ascension: "In the 1950s we would have been blacklisted," she says. "We couldn't have had the agenda we have and be hired."
But an equally important reason for the rise of the miasmaticians is the dominance of identity politics in every other area of public discourse. In declaring that racism and sexism determine the very fundamentals of life for women and minorities, the miasmatics parrot their colleagues in the rest of the academy. It was just a matter of time before public health picked up the jargon and conclusions of multiculturalism.
Not that the miasmatics have completely swept the field. At the opposite end of the public health spectrum are the genetic researchers, with their ever more impressive breakthroughs in finding the genetic and molecular determinants of disease. The geneticists are anathema to the miasmatics, because finding genetic correlates for disease obviates the need for the gender and race revolution. Midway between the geneticists and the miasmatics is "risk factor epidemiology," which studies the relation of both individual behavior and environmental factors to disease. Though far less precise in its causal conclusions than genetic epidemiology, risk factor "epi" is nevertheless a model of scientific rigor compared with the miasmatics.
Few traditional epidemiologists will publicly challenge the race-and-gender crowd. Even so, it is not hard to detect a certain chip on the miasmatic shoulder. "The gatekeepers of epidemiology," Zierler says scornfully, "are white male M.D.s"—three of the most damning words in a miasmatician's vocabulary. The miasmaticians claim, quite falsely, that all the power and funding lie with high-tech genetic epi. "One of the issues in this environment," explains Denise Herd, a professor of Multicultural Health at Berkeley's School of Public Health, "is that medicine is the elite field. Sometimes you get socially conscious doctors, but if not, biological science is perceived as more relevant to public health than social science."
Nothing could be more odious to a miasmatician than a preference for hard science. To them, "biomedical" is a term of derision, for it implies a focus on an individual human being (rather than on power relations in society) by a detached observer with implicitly superior knowledge. This is far too individualistic, Western, and male a concept for the miasmaticians. Science's status, according to Harvard's Nancy Krieger, grows out of cold war paranoia and McCarthyism rather than out of its breathtaking intellectual insights. The CDC's Carolyn Beeker recommends as an antidote to this repressive scientism that the researcher act as "advocate, collaborator, or mentor"—in other words, as political activist.
Nothing better demonstrates the miasmatics' contempt for traditional science than the citations in their published research. A partial citation list for an article in the American Journal of Preventive Medicine by Nancy Krieger and Diane Rowley, an assistant director for science at the CDC, includes (among more traditional material): strident feminist bell hooks [sic] ("Ain't I a Woman: Black Women and Feminism"); equally strident and unscholarly literary theorist Michelle Wallace ("Black Macho and the Myth of the Superwoman"); novelist and race critic Toni Morrison; lesbian poet Adrienne Rich; radical feminist commentator Barbara Ehrenreich; and The New Our Bodies, Ourselves: A Book By and For Women, by the Boston Women's Health Book Collective. What scientific evidence do Krieger and Rowley submit for the completely unempirical charge of widespread "racism in medical school"? A "personal communication": perhaps, in other words, a whine from a friend. Equally revealing is what counts as noncontroversial fact, needing no footnote: Krieger and Rowley state that in the U.S., "women are routinely treated as sex objects and face the daily harassment of street remarks." How do they know?
This cavalier approach to fact pervades all miasmatic research on racism and sexism. The CDC has so far spent roughly $3 million conducting a three-city "ethnographic" (i.e., nonscientific) study of black women's experiences with racism and sexism. The study uses the Krieger fail-safe method for finding racism: if a black woman says she has been discriminated against, she has. If she says she has not, she really has, because that means she has low levels of racial pride, a sure sign of discrimination.
What about misperception problems? I asked Diane Rowley. What if that allegedly racist bank teller or welfare worker treats everyone brusquely, or what if the allegedly brusque behavior is simply businesslike? No problem—the CDC falls back on folk wisdom. "There's a saying," Rowley explains: "Black people are paranoid, but in most cases it is justifiable paranoia."
The miasmaticians' blame-the-male research manifests equal indifference to fact. Hortensia Amaro of Boston University's School of Public Health used a National Institute on Drug Abuse grant to show that men are to blame for involving women in drugs. She interviewed 35 drug users, but her resulting psychohistories hardly support the image of the helpless or put-upon female.
For example, Lisa, a 20-year-old mother of two, says: "Well, getting high, I've always kept a drug dealer next to me. I mean, I got my kids' father, who I love, but he won't give me drugs. And I love my drugs, too. And drugs just means a little bit more to me than he did." A mother of three with a $100-a-day heroin and cocaine habit used to get money from her U.S. veteran partner. "But I was still hustling, you know, lying, stealing, and cheating, and prostituting with him, but he didn't know it. . . . I got to the point where not only I was stealing out of the store, I was robbing people, um, I was prostituting. You know, I was doing things I said I would never do." Other women admit to staying with men they don't like just to get drugs. One woman introduces her mother, a former heroin user, to cocaine; others start using drugs not with men but with their girlfriends.
Such morally equivocal stories do nothing to shake Amaro off the trail. She concludes bathetically: "Men who go to jail [as had 49 percent of the women in her study], men who do not take care of them or their children, and men who disappoint them fill the lives of these women." Therefore, gender relations need to change before women can be expected to avoid drug use. But of course, these women should have disappointed themselves by their reckless behavior toward their children. To blame men for their predicament is a moral dodge.
What does the government intend to do with all the treatises on sexism, racism, and health it keeps subsidizing or authoring? Its intentions are unclear. But those of academic miasmaticians are more definite. "I want . . . people [to] understand that they're complicit in oppression," Brown's Sally Zierler announces. Zierler wants everyone to acknowledge his role in making poor people get HIV and use drugs: "It can be transformative to realize your complicity," she says: ideally, you go through a "hierarchy" from "guilt to anger to real commitment." And then, with a whole cadre of people who understand that individual responsibility is an oppressive concept, who knows what major societal changes will be possible?
All this is a far cry from the public health profession's distinguished past, with its multiplicity of practical improvements to the quality of ordinary life. As new populations flooded into the cities in both Europe and America in the nineteenth century, urban aristocrats and plutocrats looked around and said, "The poor are living in wretched and often lethal conditions; we have a responsibility to improve those conditions." The middle- and upper-class movements for sanitary and housing reform—creating air shafts in crowded tenements and closed drainage systems in the streets—and for unadulterated milk and clean water represented the first great wave of Victorian reform and a monumental advance in social consciousness. In subsequent decades, public health pioneers would zealously sleuth out the source of infectious disease in swamps and slums, often falling ill themselves, while spreading the gospel of cleanliness.
The early sanitary reformers on both sides of the Atlantic recognized that new sewage and drainage systems, though essential to removing the lethal "miasmas," could not alone solve sickness and early death in the slums. The habits of the poor needed changing, too. John Simon, London's first medical health officer, expressed sentiments typical of the time in his 1848 First Annual Report: "Among the influences prejudicial to health in the City of London, as elsewhere, must be reckoned the social condition of the lower classes," he wrote. "The filthy, or slovenly, or improvident, or destructive or intemperate, or dishonest habits of these classes, are cited as an explanation of the inefficiency of measures designed for their advantage. . . . It is too true that, among these classes, there are swarms of men and women who have yet to learn that human beings should dwell differently from cattle; swarms, to whom personal cleanliness is utterly unknown." While the city fathers must improve public sanitation and the like, he concluded, no real improvement in the health of the slums is possible without "improving the social condition of the poor."
That meant not just economic improvement but, equally important, a change in behavior. In France, René Louis Villermé, a member of the hygiene department for the French Royal Academy of Medicine, similarly called in 1821 for moral regeneration, not the redistribution of wealth, as a key to improving the health of the poor. In the same vein, American statistician Lemuel Shattuck argued in his 1850 Report of the Sanitary Commission of Massachusetts that drunkenness and sloth among the poor were destroying their health. Since such lack of personal responsibility puts everyone at risk, he concluded, it was as much the state's duty to raise the moral level of the poor as to build the infrastructure for civic cleanliness.
The science and technology of public health grew far more sophisticated in the late nineteenth century, but the moral tone of the field persisted. An 1883 article in the Journal of the American Medical Association declared: "Public health . . . is the companion of orderly habits and pure morals." No one doubted the individual's responsibility to practice elementary sanitation, for many of the most important public health reforms continued to be low-tech behavioral change. Early in the twentieth century, the Public Health Service eliminated typhoid in the rural South with its clean privy campaign, cut death rates from infectious diseases by advocating trash can lids to keep out flies, and ended trachoma, a highly contagious, blinding eye infection, in Appalachia by the advocacy of soap and water and separate towels for different family members.
Many public health pioneers had a fervent commitment to improving social welfare. But they would have found it inconceivable to argue that seeking individual behavior change is against the interest of the poor or nonwhite. Too much evidence existed of the importance of personal habits to health, and the field has usually respected evidence.
So the pioneers would be especially horrified by the modern miasmatic policy toward sexually transmitted diseases, which, unlike the epidemic diseases of the past, still plague an astoundingly high number of Americans. Two principles—the protection of the public and the advocacy of individual restraint—governed public policy regarding venereal diseases for most of this century, up until the advent of AIDS. During World War I, for example—when venereal disease constituted such a military disaster that, given the option of eliminating all wounds or eliminating VD, every army commander would choose to wipe out VD, according to then-Surgeon General W. C. Gorgas—the only real hope was changing individual behavior. As Gorgas concluded: "It is the individual action and the individual beliefs of our people affected that are finally going to control the disease." Hence, he advised, the sexual morals of the male population must be elevated to the same plane as that of the female population.
In addition, the army tried hard to protect the public by monitoring behavior. It tested extensively, it searched out and treated all the sexual contacts of the infected patient, and it isolated the patient for treatment, methods that for decades were considered the best public health practice.
The government-sponsored All-America Conference on Venereal Diseases in 1921 similarly exemplifies the traditional public health world view. The conferees rejected the contention of the Freudian psychological section that, because "disastrous consequences [follow from] repression of the sex instinct," the conference should avoid recommending continence as a disease-preventing measure. The conference committee, while noting "the complexity of the question of the relation of continence to the total well-being of the individual," nevertheless resolved that "the dangers and disadvantages to the individual and the race which ensue upon the infringement of continence in the unmarried man or woman are so serious that they outweigh the possible physiologic disadvantages of sexual abstinence." After all, the conferees agreed, "the prevention of contact between infected and uninfected individuals is the first principle of prophylaxis" (i.e., prevention).
However obvious this "first principle" may seem, the prevention of sexual contact between infected and uninfected individuals plays no part in today's AIDS prevention efforts. Public health authorities have never dared suggest that infected people should refrain from intercourse with the uninfected, or that the infected should exercise sexual restraint. Equally outside the realm of today's public health discourse is the All-America Conference's view that if an infected person is likely to continue spreading the disease, the doctor's duty of confidentiality is over. At that point, resolved the conference, the doctor must "exercise whatever means are at his command" to protect the public health—presumably, quarantine (a traditional, if extreme, public health measure) or publicizing the patient's infected status.
But today, at least 40 percent of persons infected with HIV do not tell their sex partners of their status, and nearly two-thirds of those do not regularly use condoms—suggesting that the government's laid-back prevention philosophy is not working. Yet until recently, public health authorities placed patient confidentiality far above their duty to protect the good of all. Caving in to gay activists, they abandoned the elementary practice of reporting the names of the infected to a central registry; and they have never emphasized the individual's obligation of disclosure. The CDC stopped testing newborns for HIV several years ago, when threatened with legislation that would require it to notify mothers if their babies tested positive. The CDC put the mother's rights of confidentiality above the innocent baby's need for immediate treatment, a reversal of public health traditions. Only a political backlash against "AIDS exceptionalism"—the exempting of AIDS from conventional public health measures—has begun to return some common sense to AIDS policy through legislation.
As a result of aggressive public health sleuthing and the social stigma against promiscuity, VD was way down by the late 1940s. But starting in 1957, on the eve of the sexual revolution, it began to rise. Today, the U.S. leads the industrialized world in sexually transmitted diseases, with 10 million to 12 million new cases a year. And the response of the public health community is, in essence, Way to go! Asked whether we should recommend abstinence to prevent AIDS and other sexually transmitted diseases, Boston University's Hortensia Amaro says: "No, that would be to shut down the voice, the internal voice of sexuality." That Amaro advises the Massachusetts Department of Health on HIV prevention does not give one hope for ending the problem anytime soon in that state.
Like Amaro, the CDC and indeed the entire public health profession are big believers in the "internal voice of sexuality." A CDC poster at the 12th World AIDS Conference in Geneva this July announced incredulously that "only" 56 percent of teens interviewed in a phone survey in three American cities said they had a condom with them at the time of the interview, and only 32 percent reportedly took a condom with them the last time they left the house. And at last year's annual American Public Health Association conference, a researcher from the Medical College of Ohio in Toledo reported her surprise when, in a survey of college freshmen at an unnamed southeastern state university, "only" 61 percent of respondents reported having had "voluntary intercourse." Struggling for an explanation, she hypothesized that they must have come from a "fundamentalist background."
In perhaps the most momentous reversal of historic public health practice, some modern miasmaticians view traditional morality as the very cause of sexually transmitted disease. In a remarkable article in the June American Journal of Public Health, a group of sociologists at the University of Chicago blame Americans' disapproval of premarital, extramarital, and homosexual sex for our high sexually transmitted disease rate. Such regressive attitudes impede effective public health campaigns, the authors claim.
Schools of public health are doing all they can to combat such dangerous sexual morality. For example, the public health department in New York University's school of education offers a whole course on "Alternative Lifestyles." The aim of the course is to question the "personal feelings" individuals may have toward non-traditional families and "intimate living relationships." Just to make sure students understand the backwardness of the heterosexual family, the department provides a cradle-to-grave curriculum on homosexuality, from "Gay and Lesbian People: Adolescents" to "Gay and Lesbian People: Aging." The agenda of these courses is politics, pure and simple, focusing on how "professionals may assist gay and lesbian people in affirming their identities, securing their rights, and coping with stress." Will anyone discuss the health risks of the promiscuous homosexual "lifestyle"? Unlikely.
The government's health agencies are also doing their best to counter regressive old-fashioned attitudes. The National Institutes of Health have a whole set of guidelines on the recruitment of lesbians as subjects for clinical research, part of a spurious and wastefully expensive congressional mandate to ensure gender and race "diversity" in clinical trials. The lesbian guidelines, like the race and gender quotas, turn out to be nothing more than a Full Employment for Lesbians Act. According to the NIH, researchers must "reassure" prospective lesbian trial participants "that lesbians have been involved in the writing of the project and analysis of data." Researchers should also "include 'out' lesbians in all staff and research levels and establish a local lesbian advisory committee for further advice."
Given these views, it's no surprise that the federal public health bureaucracy's prevention strategies for HIV and other sexually transmitted diseases are so muddy, involving "skill-based training that increases, through modeling and practice, decision-making and communication skills that support reduction of sexual risk behaviors," as the CDC puts it. In this characteristic verbiage, only one thing is clear: the CDC is not about to call for testing, monitoring, partner notification, or, heaven forbid, abstinence. For the CDC, the teen "sexual partner" is a given; the only negotiable part is the amount of "communication" one has with that "partner."
The CDC has spent approximately $12 million on a five-year, five-city teen sexually transmitted disease prevention project, featuring condom demonstrations on dildos in a church in Nashville and condom ads on buses and billboards. A layman might assume that the CDC would measure the success of the program by changes in the sexual disease or pregnancy rates. But no: instead, the agency measures self-reported "risk-reduction" behavior. Yet surely, the bottom line for $12 million in tax dollars ought to be actual disease and pregnancy reduction, not merely whether condom use increased.
What really excites public health professionals today is not reducing teen pregnancy, and certainly not reducing teen sex, but "empowering" girls. I asked Andrea Solarz, a community psychologist at the Institute of Medicine, a division of the National Academy of Sciences, about the institute's teen pregnancy prevention approach. "It's not 'Just say no,' " she replied. "We're more likely to do an intervention that empowers teens to negotiate the process of decision, that empowers girls to make the choices they want to make." If that means intercourse, fine.
The fatuousness of the miasmatics should not obscure the continued importance of public health, from the unsung labors of municipal health departments in testing water supplies and monitoring infectious diseases to the fight against terrifying new antibiotic-resistant bacteria in hospitals. Even in such politically conquered institutions as the CDC and NIH, serious, vital science is still being done. But the field increasingly identifies itself by the most radical elements within it. The keynote speaker at the American Public Health Association's annual meeting this November will be Jesse Jackson; the association's miasmatic caucuses—from socialist to lesbian, gay, and bisexual—already plan a show of force. This self-indulgent pursuit of a gender and race revolution squanders the great legacy of public health, whose most enlightened practitioners sought to balance public and private responsibility for health.