Changing Medical Practices in Early America

This paper is only peripherally related to West Boylston. I found "The History of West Boylston" (published by the West Boylston Historical Society) to be extremely useful additional reading when I took a class on the history of medicine and health care. If you enjoy history as much as I do, you may be interested in finding out what medical treatment was like in the "olden days."

When North America was initially settled by Europeans, the early settlers found that the promise of life in America often did not match the "advertising" of "the land of milk and honey." The early settlements were often ravaged by starvation and disease. Over 80% of the Virginia colonists died in the first years, and nearly 50% of the first Massachusetts colonists died over the first winter.[1] As more Europeans settled in North America, their settlements were quite spread out, roads were often poor, and people were extremely isolated. Since America lacked long-settled cities and universities, formal medical training and hospitals were essentially unknown.

During the colonial era, doctors' education was informal. Most were literate, but some, particularly those raised outside of New England, where primary schools were a part of almost every town, were not. A man who wished to practice medicine did not need any type of certification. Most did serve a period of apprenticeship with an established physician, but even this was not a requirement. Up until the late 19th century, very few doctors had a college education, though a few young men with wealthy parents did study medicine at the University of Edinburgh, University of London or University of Padua.[2] Medical education, never very good in colonial days, actually declined in the early federal period due to a lack of teachers, inadequate clinical facilities and a shortage of cadavers for dissection, to the point that only seven medical schools were open in the United States in 1813.[3]

Medical facilities were informal. Most patients were treated in their homes. However, even the smallest towns had poorhouses, where destitute people could live and receive limited medical care:

"...the town expended $848.56 on the support of the poor plus $335.87 for the care of paupers 'at Hospital.'...The unfortunate who needed town aid...(had)...his name printed in the town report for all to see...If a pauper died, his funeral expenses were borne by the town."[4]

The condition of these poorhouses was often squalid, and it was difficult for towns to attract doctors to work in them. Frequently, the poor, medically untrained residents cared for one another, since there was no other option.

The few hospitals that opened in North America during the colonial period were in places like Quebec and New Orleans, both cities dominated by the French. Finally, Benjamin Franklin and Dr. Thomas Bond raised money to open the Pennsylvania Hospital, the first hospital in America that was not also a poorhouse. This hospital admitted people with infectious diseases only if they could be quarantined in special rooms rather than housed in the wards.[5]

Public health was virtually unknown in North America at this time. Towns and cities did not have boards of health except during times of epidemics (or threatened epidemics). Most Americans got their water from pumps and used outhouses until well into the 19th century because most places did not have public water or sewer systems. There was no trash collection so the streets became a breeding ground for all types of disease.

There were a few attempts to influence public health, but most of these were only local efforts. For example, when smallpox inoculation spread in the 18th century, many small-town doctors, particularly in New England, ran "smallpox resorts" where groups of people were variolated and then quarantined for a few days to make sure they developed only a mild case of smallpox. Because New Englanders were already used to the concept of inoculation, when the improved smallpox vaccination was introduced in the late 1790s, it gained widespread acceptance.[6] The state and federal governments did not get involved in these attempts to improve public health; they were generally managed locally.

The colonial and early federal periods marked the height of "heroic medicine," in which purgings, bleedings, and high doses of toxic drugs like calomel constituted treatment for almost every condition. Since many diseases are self-limited, the "cures" may have killed more people than the diseases themselves. Between heroic medicine and a geographically dispersed population that demanded a high level of self-reliance, the public developed a very skeptical attitude towards regular doctors. In the early 19th century, the spirit of Jacksonian democracy was common across America, which further heightened the "do it yourself" attitude of many Americans. Irregular medical sects were popular worldwide in the 19th century, but they were particularly common in the United States. These sects, while they freely gave medical advice, emphasized the participation of the patient in his or her own treatment.

One popular irregular medical sect, particularly in rural areas, was the Friendly Botanical Society. Started by Samuel Thomson, the New Hampshire author of The New Guide to Health: Botanic Family Physician, it popularized taking herbs and drinking lots of herbal teas and wines. While Thomsonian medicine still preached the use of emetics (lobelia), it was strongly opposed to both bleedings and calomel. In some ways, the Friendly Botanical Society was like the Amway of its day, as it stressed the use of door-to-door salespeople to peddle its books and herbal remedies.

Another reaction against heroic medicine was homeopathy. It was started by a university-trained German doctor named Samuel Hahnemann. Hahnemann said that doctors were giving their patients too much medicine. He believed that tiny amounts of drugs should be diluted in water before being given to a patient and that practitioners should take very thorough medical histories of each patient. Homeopaths welcomed women as physicians at a time when women were not permitted to practice regular medicine.

Andrew Taylor Still started the practice of osteopathy. Osteopathy incorporated bodily manipulations similar to those seen in modern chiropractic; in osteopathy, these manipulations were believed to affect the magnetic flow of energy in the body. Osteopathy discouraged the use of medicines but did not forbid them. Some aspects of osteopathy sounded like quackery, particularly Still's experiments with hypnotism and the use of magnets to influence the blood flow in the body. Yet while most of the other irregular sects died out over time, the doctor of osteopathy degree is still granted, and osteopaths often are trained in tandem with regular doctors.[7]

Quackery was basically a way to fool people into believing they were being cured while making money from them. Quackery had even been licensed in London, but it was essentially ignored by the American government for hundreds of years. While some quackery did come from otherwise eminent physicians (Dr. William Hammond, one of the first neurologists in America, developed the theory of isopathy, in which animal extracts were used to treat a number of diseases from impotence to a weak heart[8]), most of it was created by hucksters. Quackery could be deadly, since there was no regulation of what a patent medicine could contain. Once the American Medical Association got started, it went after quacks and pressed the government to make the sale of quack medicine illegal. Eventually, with the banning of narcotics in non-prescription drugs, the impact of quack medicine was lessened. While quack medicines, particularly for "weight loss and stamina," are still commonly available in the United States, they must now at least be safe (even if they are not always genuinely effective). People respond to quacks now for the same reason they always have; as P. T. Barnum said, "There's a sucker born every minute." When a person is sick and has no alternative, he will jump at the first thing that looks like it might work (laetrile, cabbage soup, gallons of juice a day). The use of quack drugs for serious diseases appears to be on the wane, but they are as popular as ever for "lifestyle" issues.

Health fads are tougher to characterize. Like the irregular sects, health fads tend to be indulged in by people who want to treat themselves. Some health fads (moderate jogging, vegetarianism, hydropathy) are not generally dangerous and have at least some health benefit. During the 19th century, as a literate middle class blossomed in the United States, so did the number of faddish practices, particularly those involving diet. Vegetables, graham crackers, and cereals were all centerpieces of 19th-century health fads.

Hydropathy was something of a "special case." It really was not an irregular medical sect the way the Friendly Botanical Society or osteopathy was, and while many called it quackery, it really was not that either. More than the "healing power of water," hydropathy demonstrated the value of the rest cure, the importance of having like-minded people around, the usefulness of light exercise, and the fact that women who wore loose-fitting clothing generally felt better and had fewer physical complaints than those who did not. So even if the water itself did not have special curative powers, the fact that middle-class people (women in particular) were allowed to get away from their normal routines and come home rested made a positive impact on some parts of society.[9]

Another "special case" is the entire issue of faith-only healing. While empirical evidence in favor of faith-only healing is lacking, anecdotal evidence suggests that some people who pray do experience spontaneous remission of certain disease. However, some people who do not pray also experience similar spontaneous remission. Like hydropathy, faith healing's benefit may be to point out other important issue in regaining one's health -- the power of a positive attitude. Faith-only healing seems to be experiencing something of a resurgence, as a number of recent court cases have borne out. While medicine is hardly a perfect system, most people do acknowledge that it is better than doing nothing or limiting one's treatment to prayer.

The late 19th century marked the incorporation of major changes in medicine around the world, but particularly in the United States. Between 1840 and 1900, medicine went from being a medieval art to incorporating many elements of modern science. Advances in chemistry and biology had major impacts on medicine. As medical practitioners began to understand that the body was composed of basic chemicals and not mysterious humors, effective treatments for diseases and injuries were developed. Purgings and bleedings went out of vogue.

But as medicine became more scientific, it was clear that doctors needed both training and licensing. In 1847, Dr. Nathan Davis founded the American Medical Association (AMA) in Philadelphia to help create professional standards for doctors and set minimal educational requirements.[10] Medical colleges opened up across the country, gradually expanding their requirements from a few months of study without any college background to a number of years with a college degree as a prerequisite. However, these colleges provided an extremely erratic level of medical education, with some of them being little more than diploma mills while others provided a high-caliber medical education. The erratic level of medical education in America continued to be a problem well into the 20th century, when the Flexner Report made many suggestions for the improvement of medical education, most of which were implemented.

As American cities exploded in size during the 19th century due to massive immigration from Europe, public health, particularly in the cities, became more of an issue. With many hundreds of thousands of people living in extremely crowded, unsanitary conditions, tuberculosis was often at epidemic levels in the cities. The Shattuck Report on public health was released before the Civil War and encouraged city governments to start cleaning up their acts, but the cities remained filthy until very late in the 19th century.

During the 19th century, people started to understand that TB was not caused by miasma but by bacteria. People with TB were sometimes sent out of the city to sanitariums in the country, where the cleaner air seemed to help their recovery. The problem of air pollution, particularly in the cities, was not really dealt with in an effective manner until the mid-20th century. As the importance of having clean water for drinking, bathing, and waste removal was understood, cities undertook massive sewer projects to bring clean water into the cities while removing waste water from them. Dead animal carcasses and garbage littered the streets until the late 1800s, when cities organized sanitation crews to haul the trash out of town and established dumps to hold the waste. Many cities started walk-in dispensaries so the poor could receive treatment and medications for little or no cost.

With the urban population explosion, poorhouses became even larger and harder to manage. With new medical advances, people needed to be in a special setting to receive certain types of treatment. Americans began to build hospitals across the country in the 19th century. The new hospitals were generally cleaner than the old poorhouses, since they were built at a time when people were starting to understand the importance of cleanliness to health.

In the 20th century, the pace of technological change has made medicine an extremely expensive enterprise. Medical costs were about 14% of the GNP in 1991. Medical education is extremely expensive, often costing medical students over $200,000 for their training. Taking on an enormous burden of debt at the start of their professional lives seems to push new doctors toward the more lucrative specialties of medicine (surgery, neurology, plastic surgery) over the necessary but less lucrative ones (internal medicine, pediatrics, rural general practice).

Medical schools are in a real quandary, as buying the latest equipment and paying physician/professor salaries cost much more than medical tuition can cover. They are constantly looking for more funding from pharmaceutical and medical equipment companies to pay for studies that can be performed at the medical school. But since the physician/professors are required to go after grant and clinical practice money, they have less time to teach, and as a result the educational experience for many medical students may be suffering.[11] As Kenneth Ludmerer pointed out, "...the greatest threat to medical education had been the over-emphasis on medical research -- now it had become the over-emphasis on generating clinical practice revenues."[12]

The 20th century has seen continued improvements in public health, with some exceptions. Enforcement of federal pollution laws and local sanitary laws has helped to keep the air and water cleaner. Children are required to have inoculations against common diseases such as mumps, measles and whooping cough. The WIC program helps to supplement the diets of many poor women and children. However, medical care for the working poor and the mentally incapacitated is not currently viewed as a governmental responsibility. The working poor are often forced to use hospital emergency rooms for primary medical care. With the closing of many mental hospitals, the mentally ill are often homeless, in jail, or not receiving any treatment for their conditions. These are areas in which the United States should try to improve its public health care delivery.

Medical facilities improved as the federal government started giving money to build modern facilities after World War II. Additionally, the introduction of the Medicare and Medicaid programs to pay for treatment of the old and the poor brought an enormous influx of capital into the hospitals between 1965 and 1990. These monies, along with insurance payments for insured patients and donations from charities, permitted hospitals to go on a spending spree that lasted until the early '90s. Since then, the federal government has been paying hospitals less for Medicare and Medicaid in an effort to help balance the federal budget. Some smaller hospitals are closing or merging with other hospitals, and some inner-city hospitals are having trouble meeting all their financial obligations.

In conclusion, a number of both positive and negative forces will continue to affect medical education, public health and medical facilities. Improvements in technology may continue to make cost containment difficult, or they may lead to a series of cheaper treatments.