"There have been at work among us three great social agencies: the London City Mission; the novels of Mr. Dickens; the cholera." Historian Gertrude Himmelfarb quotes this reductionist observation at the end of her chapter on Charles Dickens in The Moral Imagination; her debt is to an English nonconformist minister, addressing his flock in 1853. It comes as no surprise to find the author of Hard Times and Oliver Twist discussed alongside Edmund Burke and John Stuart Mill in a book on moral history. Nor is it puzzling to see Dickens honored in his own day alongside the City Mission, a movement founded to engage churches in aiding the poor. But what's V. cholerae doing up there on the dais beside the Inimitable Boz? It's being commended for the tens of millions of lives it's going to save. The nastiness of this vile little bacterium has just transformed ancient sanitary rituals and taboos into a new science of epidemiology. And that science is about to launch a massive (and ultimately successful) public effort to rid the city of infectious disease.
The year 1853, when a Victorian doctor worked out that cholera spread through London's water supply, was the turning point. Ordinary people would spend the next century crowding into the cities, bearing many children, and thus incubating and spreading infectious disease. Public authorities would do all they could to wipe it out. For the rest of the nineteenth century, they lost more ground than they gained, and microbes thrived as never before. Then the germ killers caught up, and pulled ahead. When Jonas Salk announced his polio vaccine to the press in April 1955, the war seemed all but over. "The time has come to close the book on infectious disease," declared William Stewart, the U.S. surgeon general, a few years later. "We have basically wiped out infection in the United States."
By then, however, infectious diseases had completed their social mission. Public authorities had taken over the germ-killing side of medicine completely. The focus shifted from germs to money, from social disease to social economics. As germs grew less dangerous, people gradually lost interest in them, and ended up fearing germ-killing medicines more than the germs themselves.
Government policies expressed that fear, putting the development, composition, performance, manufacture, price, and marketing of antibiotics and vaccines under closer scrutiny and control than any public utility's operations and services. The manufacturers of these drugs, which took up the germ-killing mission where the sewer commission left off, must today operate like big defense contractors, mirror images of the insurers, regulatory agencies, and tort-litigation machines that they answer to. Most drug companies aren't developing any vaccines or antibiotics any more. The industry's critics discern no good reason for this at all: as they tell it, the big drug companies just can't be bothered.
These problems capture our attention only now and again; they hardly figure in the much louder debate about how much we spend on doctors and drugs, and who should pay the bills. Public health (in the literal sense) now seems to be one thing, and, occasional lurid headlines notwithstanding, not a particularly important one, while health care is quite another.
We will bitterly regret this shift, and probably sooner rather than later. As another Victorian might have predicted (he published a book on the subject in 1859), germs have evolved to exploit our new weakness. Public authorities are ponderous and slow; the new germs are nimble and fast. Drug regulators are paralyzed by the knowledge that error is politically lethal; the new germs make genetic error, constant mutation, the key to their survival. The new germs don't have to be smarter than our scientists, just faster than our lawyers. The demise of cholera, one could say, has been one of the great antisocial developments of modern times.
By withdrawing from the battlefield just long enough to let us drift into this state of indifference, the germs have set the stage for their own spectacular revival. Germs are never in fact defeated completely. If they retire for a while, it's only to search, in their ingeniously stupid and methodically random way, for a bold new strategy. They've also contrived, of late, to get human sociopaths to add thought and order to the search. The germs will return. We won't be ready.
Microbes discovered the joys of socialism long before Marx did, and in matters of health, they made communists of us all. Since the dawn of civilization, infectious disease has been the great equalizer, with the city serving as septic womb, colony, and mortuary. Epidemic ("upon the people") is the democracy of rich and poor incinerated indiscriminately by the same fever, or dying indistinguishably in puddles of their own excrement.
The Mao of microbes was smallpox, which killed 300 million people in the twentieth century alone. Sometimes called the first urban virus, it probably jumped from animals to humans in Egypt, Mesopotamia, or the Indus River Valley, at about the same time that the rise of agriculture began drawing people together in towns and cities. Smallpox has also been called nature's cruelest antidote to human vanity. Princes broke out in the same pustules as paupers, reeked as foully of rotting flesh, and oozed the same black blood from all their orifices. Alongside millions of nameless dead lie kings of France and Spain, queens of England and Sweden, one Austrian and two Japanese emperors, and a czar of Russia.
While the germs reigned, there wasn't much rest-of-medicine to speak of: infections eclipsed every other cause of illness but malnutrition. And when monarchs were dying, too, language and politics honestly tracked medical reality. The "social" in "social disease" reflected an epidemiological fact. It also pointed to a practical, collective solution. Disease arose and spread when people converged to create societies. It was caused by invisible agents that individuals could not control on their own. It could be eradicated only by social means: public sanitation, slum clearance, education, and, above all, a robust, germ-hating culture. It took a city to erase a cholera.
This was the overarching insight that crystallized in the public consciousness in the first half of the nineteenth century. In the Victorian version of the Puritan ethic, Himmelfarb writes, cleanliness was "if not next to godliness, at least next to industriousness and temperance." For Dickens, as Himmelfarb and others have observed, the filth in the Thames symbolized the city's insidious taint, its ubiquitous, effluvial corruption. What social historians often fail to note, however, is that by the time Dickens was placing the Thames at the center of London's many ills, a new science had emerged to move the river far beyond metaphor.
Epidemiology, the rigorous science of public health, was born with physician William Farr's appointment as controller of London's General Register Office in 1838. Directed to do something about the cholera epidemic, Farr began systematically recording who was dying and where. The most important things he discovered were negative. Wealth didn't protect you from cholera. Neither did occupation, or residing close to the sea. What mattered was how high above the Thames you lived. Farr concluded that the river's horrendous stench caused the disease. Another English doctor, John Snow, made the right connection in 1853: London's sewers emptied into the Thames, so the farther down-sewer you lived, the more likely you were to drink foul water. A year later, Snow saved countless lives by persuading parish authorities to remove the handle from the Broad Street pump in Soho.
The rest is history. By pinning down the waterborne pathway of contagion, Farr and Snow had transformed a devastating public disease into a routine exercise in civil engineering. In 1858, Parliament passed legislation, proposed by then-chancellor of the exchequer Benjamin Disraeli, to finance new drains. Charles Dickens published his last novel, Our Mutual Friend, in which the main character is the pestilential Thames, in 1864. London suffered its last cholera epidemic in 1866.
This wasn't the end of great plagues in the city, or even the beginning of the end, but it was the end of the beginning. In 1872, Disraeli rallied his Tory Party around what his Liberal opponents derided as a "policy of sewage": reforms involving housing, sanitation, factory conditions, food, and the water supply. While he served as prime minister, these policies became law. For the next 50 years or so, in the United States as in Britain, public health depended on city bureaucrats above all. They wasted little time with sick patients, other than sometimes ordering them to lock their doors and die alone. They focused instead on eradicating germs before they reached the patient, and that meant attending to the water, sewage, trash, and rats.
In a recent British Medical Journal survey, public sanitation was voted the most important medical advance since the journal was established in 1840. If we don't think of public sanitation as medical any more, it's only because the municipal bureaucrats who followed Farr cleaned things up so well.
As they wore out their welcome in public spaces, microbes went private. They still had to move from person to person, but there could be no more carefree joyrides on rats or surfing through the water supply. People themselves, however, are almost as infectious as their sewers. Clean water alone could not eliminate coughs, dirty hands, and filthy food.
The systematic pursuit of germs into the flesh of patients didn't really begin until the very late nineteenth century. Jenner's smallpox vaccine was already a century old, but it owed its existence to the lucky fact that the human pox had a weak cousin that infected cows. (We give our kids the cow treatment, vacca being Latin for cow, every time we do as Jenner did and challenge their immune systems with a corpse, cousin, or fragment of a horrible microbe.) The systematic production of other vaccines had to await the arrival of Louis Pasteur and Robert Koch, who developed procedures for isolating microbes and then crippling or killing them.
Vaccines, health authorities quickly recognized, are quintessentially public drugs. They expel germs not from the public water but from the not-quite-private lungs, fluids, and intestines of the public itself. When enough people get vaccinated, herd immunity protects the rest.
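The logic of herd immunity can be made concrete with the standard threshold formula (not stated in the text): if each case infects R0 others in a fully susceptible population, outbreaks die out once more than 1 - 1/R0 of the population is immune. A minimal sketch; the function name and the R0 values are illustrative assumptions, not figures from this essay:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune for an outbreak to
    shrink: each case must then infect fewer than one susceptible person."""
    return 1.0 - 1.0 / r0

# Illustrative reproduction numbers (assumed for this sketch):
for r0 in (5.0, 15.0):
    print(f"R0 = {r0:4.1f}: roughly {herd_immunity_threshold(r0):.0%} must be immune")
```

The arithmetic shows why widespread vaccination protects even the unvaccinated remainder: the more contagious the germ, the closer the required coverage gets to 100 percent.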
Five human vaccines arrived in the late nineteenth century, and many others would follow in the twentieth. They weren't developed quickly or easily, but they did keep coming. In due course, and for the first time in human history, serious people began to believe that infectious disease might come to an end. Scientists could painstakingly isolate germs that attacked humans. Drug companies then would find ways to cultivate and cripple the germs, and mass-produce vaccines to immunize the public. Disease would fall, one germ at a time, and when they were all gone, good health would be pretty much shared by all.
New laws, vigorously enforced, drafted the healthy public into the war on germs. England mandated universal smallpox vaccination in 1853. Facing a smallpox outbreak, Cambridge, Massachusetts, decreed in February 1902 that all the town's inhabitants be vaccinated or revaccinated, set up free vaccination centers, and empowered a physician to enforce the measure. A certain Henning Jacobson refused to comply, insisting on a constitutional right to care for his own body and health "in such way as to him seems best." The U.S. Supreme Court disagreed, easily upholding the power of a local community to protect itself against an epidemic threatening the safety of all. Later enactments would require the vaccination of children before they could attend public schools. Adults who traveled abroad had to be vaccinated if they planned to come home. Albert Sabin's polio vaccine took things even further: the vaccine itself was infectious. A child swallowed a live but weakened virus soaked into a sugar cube, and then went home and vaccinated his siblings and parents, too.
The germ killers didn't really get into the business of private medicine (curing already-sick patients) until the development of sulfa drugs in the 1930s, followed by antibiotics after World War II. Even then, much of the cure still lay in preventing the infection of others. The paramount objective with tuberculosis, for instance, was to wipe out the tubercle bacillus so thoroughly that nobody would need streptomycin any more, because nobody would come into contact with any other infected person or animal.
Year by year, one segment or another of the public sector contrived to take a little more control, directly or indirectly, over the development, distribution, and price of vaccines. Soon after the development of the polio vaccine in the 1950s, Washington launched a program to promote and subsidize the vaccination of children nationwide. At about the same time, the Soviet Union proposed a global campaign to eradicate smallpox. The World Health Organization officially launched the campaign in 1966, and it ended in triumph in 1980, with smallpox declared eradicated worldwide.
A complete socialization of the war against germs seemed sensible. The germs, after all, lived in the public water, floated through the public air, and passed from hand to hand in the public square. Individuals might buy their own vaccines, antibiotics, bottled water, and face masks. But collective means could make all of that forever unnecessary. And they did. Big government attacked the infectious microbes with genocidal determination, expelling them, one by one, from human society. Defiled by monstrous human fratricide, the first five decades of the twentieth century were also the triumphant decades of public health.
To stay prepared, however, human culture apparently requires regular booster shots of smallpox, cholera, plague, or some other serious disease that indiscriminately sickens and kills. Without periodic decimation, ordinary people apparently forget what germs can do, the authorities grow complacent, scientists turn their attention elsewhere, and private capital stops investing in the weapons of self-defense.
As Sherwin Nuland observes in How We Die, AIDS struck just when the final conquest of infectious disease seemed at last within sight. In 1981, a weekly Centers for Disease Control report noted a sudden increase in a specific strain of pneumonia in California and New York. Before long, we had it on Oprah Winfrey's authority that the germs were back and were after us all. One in five heterosexuals could be dead of AIDS in the next three years, she declared in February 1987.
Whatever they were thinking about HIV, the heterosexuals had, by that point, plenty of other venereal diseases to worry about. A tragically large number of young women had contracted chlamydial infections serious enough to leave them infertile. Herpes, gonorrhea, syphilis, and some types of sexually transmitted hepatitis were also on the rise. The sexual revolution seems in retrospect to have been led by people who took William Stewart at his word when he consigned infectious disease to the dustbin of history. But rampant promiscuity packs people together tighter than slums, and germs rush in where angels fear to tread. It has taken a great deal of readily avoidable suffering and death to establish that people do need sexual taboos, taboos at the very least robust enough to thwart microbes, if not with less sex, then with more latex.
As social agents go, however, HIV and chlamydia accomplished far less than cholera. It was the demise of a germ-hating culture that had helped clear the way for new epidemics of venereal disease, and the resurrection of that culture still has a long way to go. Many people in positions of authority and influence continue to affirm the tattoo artist's expressive freedom, the bag lady's right to sleep next to the sewer, the mainliner's right to share needles in an abandoned row house, and the affluent parent's right to interpose a philosophical objection between his child and the vaccinations demanded by public schools. They propound grand new principles of freedom, privacy, and personal autonomy to protect septic suicide, even septic homicide. Social doctors in Dickens's day didn't have to invade anyone's privacy to track smallpox; it announced itself on its victims' faces. Tracking HIV, by contrast, requires a blood test, and privacy police dominated the first 20 years of the fight over testing.
A legal system that affirms the individual's right to do almost everything at the germ-catching end now struggles to decide when, if ever, we can force the Typhoid Marys of our day to stop pitching what they catch. The law that once ordered a healthy Henning Jacobson to roll up his sleeve can no longer compel a virulent celebrity to zip his fly. Infectious lifestyle, once a crime, is now a constitutional right.
Many people just don't care much, and it's easy to see why. Habits and lifestyles that the Victorians learned to shun look a lot less vile when they lose not only their repulsive cankers, pustules, sputum, fevers, diarrhea, dementia, and emaciation, but also their power to impose these horrors on the neighbors. Just three months after Oprah warned heterosexuals about AIDS, President Reagan thought it necessary to remind us that we were battling a disease, not our fellow citizens. Everyone knew why. The gay community had good reason to fear that many Americans might be thinking: HIV isn't my problem; it's theirs. The new choleras are indeed much less social than the old. Why shouldn't they forever remain so?
Over the morning coffee and toast, consumed in our tidy little kitchens, we read that drug-resistant tuberculosis is a cause for growing concern, but mainly in prisons. So too are new drug-resistant staph infections, in tattoo parlors and the foulest of locker rooms. And it's in private drug dens, bedrooms, and bathhouses, of course, that infectious germs have made their biggest comeback, contriving to get themselves spread by not-quite-private needles and genitalia. True, the germs incubated in abandoned houses, cardboard boxes, and other hovels have drifted into run-down urban hospitals, whose emergency rooms often provide primary care to the patients most likely to harbor the worst germs. But they haven't moved much farther than that. Sharing the city, it seems, no longer means sharing smallpox and the plague.
The epidemiological facts, beyond serious dispute, support the complacent majority. Germs used to ravage young bodies with inexperienced immune systems; now they mainly take the old. Though death certificates still quite often record an infection as the final cause of death, germs now are mostly epitaph killers, moving in on immune systems terminally crippled by old age, heart disease, cancer, stroke, and Alzheimer's.
In the pantheon of disease and death, lifestyle and genes have completely eclipsed germs. The great agent of social change today isn't cholera; it's cholesterol. It propagates via drive-through windows, not sewers. Crowds don't flee the city when it strikes; they pay extra for a supersize serving. In the heyday of public health, public money went to clean up public filth. Today, we're sick because we spend our private money buying bad health by the pack, the bottle, and the Happy Meal. Small wonder, then, that the germ-fighting social norms once ranked next to godliness still seem to many as antiquated as the whalebone hoops that defended Victorian virtue.
If we took the new microbes seriously, we could certainly beat them. The science for tracking, immunizing against, and annihilating germs grows more vigorous, innovative, and youthful with each passing year. But insurers and regulators now control how we use that science, and as the germ-phobic culture has decayed, they've grown increasingly slow and rigid. Many competent people at the top echelons of government worry deeply about this problem. Yet as they scramble to address it, they must hack their way through laws and bureaucracies that have accumulated and thickened since the 1960s. Technical know-how isn't enough; collective will is also necessary. Yet, paradoxical as it sounds, collective will is what was lost as government took over the show.
Good sewers, public sanitation, and fresh water are undoubtedly public ends, best advanced by public means. Yet though germ killing begins in public, it must, as the Victorians grasped, end in private, and this is where the government's attempt to take charge of everything has had terrible consequences. The Victorians had nothing but culture to wield against germs on private premises, so they taught that clean was virtuous, and dirty sinful, and they taught it very persistently to everyone. But the big, efficient, technocratic government agencies of our day don't do virtue and sin; they requisition, stockpile, subsidize, proscribe, and mandate. And they teach, implicitly but persistently, that germs are government's responsibility, not yours. Socialized germ killing makes it a lot easier for people to lose touch with the personal side of germicide.
When the government then tries to clean up human bodies with the same heavy hand that it uses to clean up the sewers, it can end up fighting both amnesiacs and those who remember too well. The forgetful push back because they are sick and tired of being hectored by the universal nanny about washing hands, vaccinating kids, and countless other time-wasting nuisances. The unforgetful seem to believe that the little routines and habits of daily life are too important to entrust to a nanny perched on the banks of the Potomac.
Consider, for example, the most important new vaccine recently licensed, to protect against the human papillomavirus (HPV). Developed by scientists at the National Cancer Institute, it's designated a childhood vaccine, which will give it some (by no means complete) shelter from the tort lawyers. Merck licensed the vaccine, steered it through the FDA, and will be responsible if anything goes wrong. The firm is charging $300 or more for the three-shot dose, ten to 100 times the inflation-adjusted cost of most vaccines in the 1950s. It may well be worth it. This new kids' vaccine protects against a sexually transmitted virus that causes many cases of cervical cancer. To be effective, however, vaccination must occur before exposure to the virus, and each new sexual partner exposes a girl to a 15 percent chance of infection. The Centers for Disease Control therefore plans to see to it that girls are vaccinated before. That, the federal authorities have concluded, would be before they are 12.
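The 15 percent per-partner figure compounds quickly, which is why the timing of vaccination matters so much. A back-of-the-envelope sketch; treating each partner as an independent exposure is my simplifying assumption, and the function name is hypothetical:

```python
def cumulative_infection_risk(per_partner_risk: float, partners: int) -> float:
    """Chance of at least one infection after n exposures, modeling each
    new partner as an independent per_partner_risk chance of infection."""
    return 1.0 - (1.0 - per_partner_risk) ** partners

# Using the 15 percent per-partner figure cited in the text:
for n in (1, 3, 5, 10):
    print(f"{n:2d} partners: {cumulative_infection_risk(0.15, n):.0%} cumulative risk")
```

On these assumptions, a majority of women with five partners would already have been exposed, which is the arithmetic behind vaccinating before sexual debut.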
Quite a few parents have concluded that the federal authorities can go to hell. The amnesiacs are beyond help; they're probably skipping other vaccines, too. As for the mnemonists, maybe some just remember that sex spreads a lot of other germs as well, and figure that they have a scheme to protect their little girls from all of them. Others probably aren't making any conscious calculation about germs; they're just holding fast to a faith and culture that still seeks to protect little girls from sex itself. People who believe government can achieve anything will say that it should just have handled this one more delicately. Perhaps, but the fact is, the government's germ killers have ended up at loggerheads with the people they most need as their closest allies: parents who teach the taboos and rules that provide a crucial line of defense against the most persistent and clever killers of children on the face of the planet.
Even when it doesn't reach the point of turning parents against vaccines, the government takeover has left many people with a triple sense of entitlement: to germ-free life, risk-free drugs, and wallet-free insurance. This in turn has created an almost profit-free economic environment for germ-hunting pharmaceutical companies, which still do much of the basic science and take charge of the essential, delicate, and difficult business of mass drug production.
The new drug law that President Kennedy signed on October 10, 1962, codified a profound change in attitude. With infectious diseases all but finished, the drugs of the future would target human chemistry. A horrified world had just discovered that one such drug, which effectively relieved morning sickness and helped people fall asleep, also halted the growth of a baby's limbs in the womb. Before, when microbes were the enemy, drugs got the benefit of the doubt. After the 1962 Thalidomide amendments, the unknown cure was officially more dangerous than the known disease. Very strong evidence would be necessary to establish otherwise.
The 1962 drug-law amendments gave decisive weight to human tests and clinical outcomes: a drug would not be deemed effective without clinical trials in which human patients started out sick and finished up healthy. Progressing in little steps from that seemingly sensible starting point, the FDA has since reached the point of worrying more about drugs evolving into sugar pills than about currently innocuous germs evolving into plagues.
The FDA has long required that clinical tests demonstrate that a new antibiotic is as good as or better than one already on the shelf, and it wants these trials to be extremely thorough and convincing. It worries that with too few patients tested, statistical anomalies might allow an inferior antibiotic to win its license, and the new one might then become the benchmark for a third, even worse one, and so on until the industry slouches its way down to licensed sugar pills. The agency calls this scenario "biocreep." It's just the sort of logical but overly theoretical concern that sneaks into government offices where the paramount objective is to avoid mistakes. But nowadays, no mistakes means no new drugs licensed until the germs start killing lots of people again. Most serious infectious diseases are rare, and it's unethical to test a new drug on seriously ill patients when there's an old, licensed alternative in easy reach. This makes it all but impossible to assemble enough sick-but-not-too-sick patients for statistically meaningful clinical tests.
For example, the FDA recently had to decide whether to approve the use of Cubicin against a vicious germ that infects heart valves. Dubbed the Darth Vader bacterium, it is stealthy, difficult to kill, and almost always fatal if untreated. The FDA's staff wasn't convinced that enough patients had been tested to establish that the drug was better than the already existing alternatives, and argued that if it was too weak, the germ might evolve into something beyond Darth, resistant to everything. On the other hand, doctors were already using Cubicin off-label (prescribing it in ways never officially vetted or approved by the drug company or the regulator), but in low doses, thus possibly creating an even greater beyond-Darth risk. In the end, the chief of the FDA's anti-infectives division overruled a staff recommendation and granted the license. But every such decision is a fight, and all the legal, political, and institutional cards are now stacked against quick, bold action.
A letter coauthored by a doctor at Harvard's medical school and published in Clinical Infectious Diseases in 2002 bluntly linked the FDA's approach to the end of antibiotics. "For nearly two decades," it began, "antibacterial research has been the Cinderella area in the pharmaceutical industry." Increasingly stringent demands for proof of efficacy, framed in ways that seem innocent and technical, the authors argued, have thrown the industry into a panic. They have wreaked irreparable damage to our ability to provide a reliable pipeline of new antibiotics for treatment of serious infections. And they probably helped propel Lilly and Bristol-Myers Squibb out of antibacterial research and development.
Most fundamentally, the FDA has no mandate, none at all, to prepare us for war against the next cholera. Its mandate is to make sure that we don't lose the battle against the next Thalidomide. "Safe" and "effective," the two key standards set out in the 1962 drug law, have intelligible meaning only with a germ to fight and infected patients to fight it in. The agency, in other words, actually needs thriving germs to supply enough really sick patients to provide FDA-caliber evidence to validate the drug that will wipe out the germs. Epidemics that lurk in the future, waiting for germs still under construction, can't be officially considered at all. But those germs, the ones that don't yet exist, the ones still evolving, by chance or with human help in a terrorist's lab, should worry us most.
Today's public health guardians have made things even harder for vaccines than for antibiotics. Gene-splicing and other bioengineering tools make it far easier today than ever before to build safe corpses, cousins, or fragments of horrible microbes. But many vaccines that could quite easily be developed haven't been, and probably won't be, because no drug company will take them to market.
Political support for the FDA depends on the public perception that it's solving problems. When widely administered, as it must be, a vaccine wipes out the disease that it targets. The disease can thus be eclipsed in the public's mind by the vaccine's side effects, however rare or even imaginary. The more effective the vaccine at the outset, the more likely it will be condemned as unsafe at the end. Moreover, the FDA has an acknowledged policy of being extra cautious in licensing any product that healthy people will use, especially children. Nothing could better suit the germs. Healthy children with undeveloped immune systems are their favorite targets.
Judges and juries have even more trouble balancing the interests of the individual who claims that a vaccine has injured him against those of the rest of the disease-free community. New legal standards formulated in the 1950s and '60s made it much easier to sue vaccine manufacturers, and relaxed rules of evidence soon made junk-science allegations much more common than legitimate ones. When liability claims spiraled to the point where they threatened to cut off the supply of some vaccines entirely, Congress set up an alternative compensation system for children (though not for adults) actually injured by their immunizations, and imposed a broad-based vaccine tax to fund it.
But it was too little, too late. Enveloped in bureaucracy, the germ-killing segment of the drug industry has lost much of its flexibility, resilience, and reserve capacityand has become painfully slow in developing what new science makes possible. Short-term economics and federal law have converged to create a systematic bias in favor of the germicidal drug invented and licensed decades ago. The principal fiscal concern is who should pay how much for the new, patented drug, or whether the old, cheaper generic might not do as well. The principal regulatory concern centers on the risks of the new drug, not the perils of the new germ. Insurers are cost averse: wherever they can, they favor cheap drugs with expired patents. The FDA is risk averse: when in doubt, it sticks with the old and says no to the new.
Research labs do continue to come up with new vaccines, but that decade-long process is now routinely followed by a second decade (at least) before a commercial product makes it to market. No one doubts that the extra time helps ensure that the vaccine eventually injected into millions of arms is safer and more effective than it might otherwise be. But the whole effort is long and costly, and the likely profits waiting at the end, many manufacturers have concluded, are very modest by comparison. In 1957, five important vaccines were being supplied by 26 companies. By 2004, just four companies were supplying 12. Mergers accounted for some of the attrition, but most of it resulted from companies getting out of the business. More than half of the vaccines routinely given to young children in 2005 came from just one manufacturer; four others had only two suppliers.
Having surrendered on all other social aspects of infectious disease, the health authorities now focus principally on socializing costs. Capping profits is the politically inevitable corollary. The federal government now buys over half of all vaccines used in the United States, and by taking on that role, it has effectively taken control of prices.
After the anthrax mail attacks in late 2001, federal authorities made it clear that if push came to shove, they would rescind patents, too. Just two years earlier, at the government's specific request, Bayer had asked the FDA to approve and label Cipro for use against inhalational anthrax. The agency granted the license, the first ever for a drug for use in responding to a deliberate biological attack, in a matter of months, basing its approval not on human trials but on the antibiotic's effectiveness in inhibiting anthrax in rhesus monkeys. Then came the 2001 mail terror. Demand for Cipro soared, and prices collapsed. Yes, collapsed.
It was politically unthinkable for Bayer to raise the retail price in pharmacies, and federal authorities immediately demanded huge discounts on the pills that they wanted to stockpile. The Canadian government initiated its price negotiations by announcing that it would ignore Bayer's Canadian patent and order a million tablets of a generic version of the drug from another company.
A couple of years later, Congress passed the 2004 BioShield law. It is intended to create a federal stockpile of bioterror vaccines, and to that end, it empowers the Pentagon to bypass certain aspects of the 1962 drug laws. Those provisions have already been invoked once, to cut off litigation against military use of the anthrax vaccine. The federal government also offered almost $1 billion of BioShield cash for the development of a new anthrax vaccine and the provision of 75 million doses. But established drug companies just weren't interested. The contract went to VaxGen, a tiny startup that had never brought a licensed drug to market and that proposed to supply a bioengineered vaccine that the army had already developed. VaxGen failed to deliver, and last December, the government canceled the contract.
A decades-old alternative with various problems continues to be provided by BioPort, whose declared mission is to develop products for use against infectious diseases "with significant unmet or underserved medical needs" and, most notably, potential weapons of bioterrorism. That would mean anthrax, botulism, Ebola, and smallpox, among other killers. BioPort employs about 450 people.
So what would a watchmaker have done (not a blind one, but one with keen eyes and an excellent loupe) if called upon to design a microbe that could thrive among people fortified by fistfuls of vaccines and backed up by dozens of potent antibiotics? Nature got there, without the loupe.
Humans had spent a painstaking century developing vaccines. So nature designed an immunodeficiency virus: an all-purpose anti-vaccine, so tiny, quiet, slow, methodical, and gentle that it spread unnoticed for decades, and so innocuous that it never quite gets around to killing you at all. It leaves that to the old guard: the bacteria, protozoa, and viruses that invade when your immune system shuts down, and feast on your brain, lungs, blood, liver, heart, bone marrow, guts, skin, and the surface of your eyes. In its final stages, AIDS is truly horrible.
When the blind watchmaker has been pulling such stunts for 4 billion years, it's reckless to suppose that HIV was its last or worst. Quite the contrary: our casual willingness to tolerate a septic underclass, so long as it remains insular and out of sight, is certain to hasten the rise of much more and much worse. People as negligent with pills as they are with germs have already helped spawn drug-resistant forms of tuberculosis, by taking enough medicine to kill weaker strains, while leaving hardy mutants alive to take over the business. HIV patients who don't strictly follow the complex, unpleasant drug regimen used to suppress the virus become human petri dishes, in which microbes multiply and evolve to resist the stew of antibiotics prescribed as a last resort.
The new infectious diseases are already very good at this sort of adaptation: they have evolved to be as nimble as we are now institutionally stolid, as flexible as we are rigid. The influenza virus evolves exceptionally fast, using pigs as its genetic mixing bowl. HIV mutates constantly; one drug-resistant strain of the virus now apparently depends for its survival on a chemical constituent of the drug widely prescribed to stop its advance. Stubbornly persistent pelvic infections are on the rise, along with drug-resistant gonorrhea and even syphilis. A certain staph bacterium responsible for the most common infection acquired in hospitals has developed ways to pass the gene that produces a lethal toxin from one strain to the next and also to certain viruses that can spread it further still.
It's because the threat is so grave that one must avoid the temptation to propose a simpleminded checklist of reforms to shoehorn somewhere into the middle of the next 1,000-page revision of the federal drug laws and FDA regulations. Germs are terrorists: they let the dead past bury its dead, they are always changing, and the ones you know aren't the ones that will kill you. If we somehow revive a tough, germ-fearing culture, the risk-averse drug regulators, penny-pinching insurers, overreaching judges, clueless juries, and preening, industry-bashing congressional committees will fall into line. If we don't, no tinkering will make much difference.
What we need is quite simple. We need many people to be much more frightened than they currently are. And we need a robust, flexible, innovative portfolio of drug companies to sink a lot of new capital into highly speculative ventures, almost all of which will lose money, with just one or two ending up embraced by regulators, eagerly paid for by insurers, vindicated every time by judges and juries, lauded by nonconformist preachers, and so spectacularly profitable for investors that they crowd in to fund more.
If we can't drum up concern by other means, some dreadful germ will materialize and do the job for us. Nobody knows which one; that's why we so desperately need the right popular culture and vigorous private enterprise. If the germs in the tattoo parlors today were both virulent and untreatable with current medicine, you wouldn't be reading this, at least not in the heart of any big city. You'd be heading for the country.
That's what the rich did when epidemics struck in Dickens's day. They knew what they were fleeing: the urban pathologies described in Our Mutual Friend in 1864 were as familiar to Londoners as the Thames. And familiar not just to the boatmen who made a living fishing human corpses out of the river but also to the middle class, decimated by a violent cholera outbreak in Soho at the end of August 1854; to the entrepreneurs who made fortunes collecting and sorting mountains of trash; to members of Parliament, who, in June 1858, had to evacuate the House of Commons to escape the pestilential stench of the river; and to Queen Victoria, who lost her husband to another waterborne disease, typhoid fever, in 1861. Small wonder that cholera was a great agency for social change. In the time of cholera, the bacterium itself loved everyone.
Anthrax prefers goats; it finds its way into human bodies only very occasionally, through open wounds. The spores can be inhaled, too, but ordinarily they clump together and don't spread well through the air. They become mass killers only when people painstakingly coat them with other materials and take special efforts to disperse them. The spores that struck 11 Americans (and killed five) in Washington, D.C., and New York in late 2001 weren't dispersed through the Potomac or Hudson Rivers; they arrived by U.S. mail. A few pounds, suitably prepared and dispersed in the New York subway, could kill 100,000 people. If cholera is a social disease, weaponized anthrax defines the antisocial bottom of contagion: it's a microbe that infects humans only with the help of sociopaths.
But we live in an age of sociopaths, and there remains much that we don't know about germs. Viruses and prions may play a far larger role in genetic malfunctions than we yet fully understand. HIV and influenza demonstrate the boundless viral capacity to mutate and evolve. And while anthrax could never make it on its own in New York, murderous people are scheming to give it help. One way or another, germs will contrive to horrify us again, in some very nasty way. A society's only real defense is to stay horrified, well ahead of the curve.