What Are Antimicrobials?
The antibiotic your doctor prescribes falls into a class of medicines called antimicrobials. These come under the general heading “chemotherapy,” which refers to the treatment of disease with chemicals. While the term “chemotherapy” is often used in connection with treating cancer, it originally applied—and still does—to the treatment of infectious diseases. In such cases it is called antimicrobial chemotherapy.
Microbes, or microorganisms, are tiny organisms that can be seen only with the help of a microscope. Antimicrobials are chemicals that act against microbes that cause illness. Unfortunately, antimicrobials can also act against microbes that are beneficial.
In 1941, Selman Waksman, codiscoverer of streptomycin, applied the term “antibiotic” to antibacterials that come from microorganisms. Antibiotics as well as other antimicrobials used in medical treatment are valuable because of what is called selective toxicity. This means that they can poison germs without seriously poisoning you.
Actually, however, all antibiotics are at least somewhat poisonous to us too. The margin of safety between the dosage that will affect the germs and the dosage that will harm us is called the therapeutic index. The larger the index, the safer the drug; the smaller, the more dangerous. In fact, thousands of antibiotic substances have been found, but most are not useful in medicine because they are too toxic to people or to animals.
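The margin of safety described above is commonly expressed as a ratio in pharmacology textbooks: the dose that harms half of a test population (TD50) divided by the dose that is effective in half of it (ED50). As a sketch, with purely illustrative numbers:

```latex
% Therapeutic index: ratio of the median toxic dose to the median effective dose
\[
\text{Therapeutic index} \;=\; \frac{\mathrm{TD}_{50}}{\mathrm{ED}_{50}}
\]
% Hypothetical example: if the median toxic dose were 500 mg and the
% median effective dose 10 mg, the index would be 500/10 = 50,
% a relatively wide margin of safety. An index near 1 would mean the
% effective dose is nearly as large as the harmful one.
```

The doses here are invented for illustration; real values vary widely from drug to drug.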
The first natural antibiotic that could be used internally was penicillin, which came from a mold called Penicillium notatum. Penicillin was employed intravenously for the first time in 1941. Shortly thereafter, in 1943, streptomycin was isolated from Streptomyces griseus, a soil bacterium. In time, scores of additional antibiotics were developed, both those that are derived from living things and those that are made synthetically. Yet, bacteria have developed ways of resisting many of these antibiotics, causing a global medical problem.
The penicillin mold colony seen at the bottom of the dish inhibits the growth of the bacteria
When they first appeared, antibiotics seemed like wonder drugs. Hitherto incurable infections caused by bacteria, fungi, or other microorganisms could now be treated successfully. Thanks to the new drugs, deaths from meningitis, pneumonia, and scarlet fever declined dramatically. Hospital infections that had formerly meant a death sentence were cleared up in a few days.
Since the time of Alexander Fleming, the discoverer of penicillin, researchers have developed dozens of additional antibiotics, and the search for new ones continues. During the last 60 years, antibiotics have become an indispensable weapon in the fight against disease. If George Washington were alive today, doctors would doubtless treat his sore throat with an antibiotic, and he probably would recover in a week or so. Antibiotics have helped practically all of us shrug off one infection or another. However, it has become apparent that antibiotics do have some drawbacks.
Antibiotic treatment does not work for diseases caused by viruses, such as AIDS or influenza. Furthermore, some people have an allergic reaction to certain antibiotics. And broad-spectrum antibiotics may kill off the helpful microorganisms in our bodies. But perhaps the greatest problem with antibiotics is their overuse or underuse.
Underuse occurs when patients do not complete the prescribed antibiotic treatment, either because they feel better or because the treatment is lengthy. As a result, the antibiotic may not kill off all the invading bacteria, allowing resistant strains to survive and multiply. This has frequently happened in the case of treatment for tuberculosis.
Both doctors and farmers have been guilty of overuse of these new drugs. “Antibiotics have often been overprescribed in the United States, and they are used even more indiscriminately in many other countries,” explains the book Man and Microbes. “They have been fed in huge quantities to livestock, not to cure disease but to aid growth; this is a major reason for heightened microbial resistance.” The result, the book warns, is that “we may be running out of new antibiotics.”
But apart from these concerns about antibiotic resistance, the second half of the 20th century was a time of medical triumphs. Medical researchers seemed capable of finding drugs to fight practically any malady. And vaccines even offered the prospect of preventing disease.
“Immunization is the greatest public health success story in history,” stated The World Health Report 1999. Millions of lives have already been saved, thanks to massive worldwide vaccination campaigns. A global immunization program has eliminated smallpox—the lethal disease that claimed more lives than all the wars of the 20th century combined—and a similar campaign has almost eradicated polio. (See the box “Triumphs Over Smallpox and Polio.”) Many children are now vaccinated to protect them against common life-threatening diseases.
Other diseases have been tamed by less-dramatic methods. Such waterborne infections as cholera rarely cause problems where there is adequate sanitation and a safe water supply. In many lands increased access to doctors and hospital care means that most diseases can be identified and treated before they become lethal. Better diet and living conditions, along with enforcement of laws regarding proper handling and storage of food, have also contributed to improving public health.
Once scientists tracked down the causes of infectious diseases, health authorities could take practical steps to halt an epidemic in its tracks. Consider just one example. An outbreak of bubonic plague in San Francisco in 1907 killed few people because the city immediately launched a campaign to exterminate the rats whose fleas transmitted the disease. On the other hand, starting in 1896, the same disease had caused ten million deaths in India within 12 years because its underlying cause had not yet been identified.
At the end of October 1977, the World Health Organization (WHO) tracked down the last-known naturally occurring case of smallpox. Ali Maow Maalin, a hospital cook who lived in Somalia, did not get a severe attack of the disease, and he was well again within a few weeks. All people in contact with him were vaccinated.
For two long years, the doctors waited anxiously. A $1,000 reward was offered to anyone who could report another confirmed “active smallpox case.” Nobody successfully claimed the reward, and on May 8, 1980, WHO formally announced that “the World and all its peoples have won freedom from smallpox.” Just a decade earlier, smallpox was causing the death of about two million people a year. For the first time in history, a major infectious disease had been eliminated.
Polio, or poliomyelitis, a debilitating childhood disease, offered the prospect of similar success. In 1955, Jonas Salk produced an effective vaccine for polio, and an immunization campaign against polio began in the United States and other countries. Later an oral vaccine was developed. In 1988, WHO launched a worldwide program to eliminate polio.
“When we began the eradication effort in 1988, polio paralysed more than 1000 children each day,” reports Dr. Gro Harlem Brundtland, then director general of WHO. “In 2001, there were far fewer than 1000 cases for the entire year.” Polio is now confined to fewer than ten countries, although more funds will be needed to help these lands finally eliminate the disease.
(Smallpox was the ideal disease to combat by an international vaccination campaign because, unlike diseases that are spread by troublesome vectors such as rats and insects, the smallpox virus depends on a human host for its survival.)
In the next article we will discuss the dangers associated with antibiotics.