U.S. Army Medical Department, Office of Medical History
Chapter 1

The State of the Art

The colonial physicians who formed the American Army's Medical Department in 1775 were all civilian practitioners, many without any military experience. A small percentage had earned M.D. degrees, but most were either apprentice or self-trained, and few made any attempt to specialize in the manner customary in Europe, where a choice was usually made among medicine, surgery, and pharmacy. During the second half of the eighteenth century, however, American doctors were growing in stature at home and abroad. Although more of them were receiving a formal medical education, usually in Europe, they were still limited by the general lack of scientific data and by their profession's predilection for reasoning rather than research as a way of discovering better forms of treatment for their patients. The traditional humoral explanation for disease was by this time losing ground to several new and conflicting systems, where fact took second place to theory, in an all-out attempt to reveal one or two basic causes for all disease. Disagreements over therapy gave added intensity to the feuds and controversies which characterized eighteenth century practice, in general, and American medicine, which was not restrained by European guild traditions, in particular. Furthermore, the effort to develop a fundamental theory deemphasized the importance of the diagnosis of specific diseases. Treatment continued to consist largely of bleeding, purging, and blistering, regardless of the symptoms, since surgery alone was based to any significant degree upon experience as a guide in preference to theory.1


The system prevailing in the colonies in the years immediately preceding the American Revolution, that of the great Dutch physician and teacher Hermann Boerhaave, explained disease in terms of chemical and physical qualities, such as acidity and alkalinity, or tension and relaxation, instead of the blood, phlegm, yellow bile, and black bile of the traditional humors, and urged that nature be permitted to aid in any cure. The Boerhaavian system was increasingly challenged in the second half of the century by that of William Cullen, the Scottish physician and teacher so much admired by

Hermann Boerhaave. (Courtesy of National Library of Medicine.)


William Cullen. (Courtesy of National Library of Medicine.)

Americans studying under him at the University of Edinburgh, many of whom would later be leaders in the Continental Army's Hospital Department. Cullen believed that either an excess or an insufficiency of nervous tension underlay all disease. Too much tension was often characterized by a fever, to be treated by a depleting regimen including bleeding, a restricted diet, purging, and rest and sedation. A cold or chill, on the other hand, indicated too much relaxation and called for restorative measures. In time, Cullen became so influential that Benjamin Rush, just gaining prominence in American medicine during the Revolution, was able to write his former teacher that the American edition of his work "was read with peculiar attention by the physicians and surgeons of our army, and in a few years regulated in many things the practice in our hospitals."2

Despite his doctrine that disordered nervous tension was the cause of all disease, Cullen encouraged the study and classification of specific diseases. Rush, however, eventually modified Cullen's doctrines, which he had originally so much admired, and discouraged the study of separate disease entities by blaming all disease on excessive tension which caused disturbance in the blood vessels. By 1793, he was openly contending that there was but one single disease in existence. The method of treatment upon which Rush insisted with increasing inflexibility called for a low diet, vigorous purges with calomel and jalap, and bleeding until the patient fainted. Rush apparently did not hesitate to remove a quart of blood at a time, or, should unfavorable symptoms continue, to repeat such a bleeding two or three times within a two- to three-day period, it being permissible in his opinion to drain as much as four-fifths of the body's total blood supply. In time, Rush's system and treatment became, in the words of a noted medical historian and physician, "the most popular and also the most dangerous 'system' in America."3

Sir John Pringle. (Courtesy of Library of Congress.)


Theorists of the eighteenth century did not generally include in their systems an explanation for the outbreak of certain forms of disease among many people in one area within a short period of time. According to his background, training, and experience, therefore, a physician might blame mass outbreaks of disease on climate and season, unhealthy elements in the air, contagion, possibly caused by "animalcules," or God's determination to punish sinful man. There were many discussions of possible sources of disease carried by the air. Rush strongly believed in the danger of bad odors, or miasmas, and Sir John Pringle, a noted British Army surgeon and a Physician General in the British Army from 1745 to 1758, much respected by the many Americans who knew him in their student days, wrote that putrefaction was the greatest cause of fatal illness in armies. He listed corrupted marsh water, human excrement remaining exposed in the hot weather, crowded military hospitals, and straw used for bedding rotting in tents as the four principal sources for putrid air. Although he recognized cold as a predisposing factor in disease, Pringle noted that heat, too, was often a cause of sickness, especially when wet clothes and beds or a very humid atmosphere tended to interfere with normal perspiration, "relaxing the fibers and disposing . . . to putrefaction." He found it "not surprizing [sic] that the dysentery and bilious fever, both putrid diseases, should ensue." The theory of the atmosphere as a cause of many types of fevers was still maintained as late as 1812. David Hosack, a respected American physician, pointed out at that time, however, that disease might also be spread by direct physical contact, as in syphilis and scabies, or through the purest air, as with smallpox. Many authorities at that time also blamed sudden changes in the weather for causing outbreaks of disease.4

The average eighteenth century physician had little in the way of either equipment or understanding to aid him in distinguishing one specific disease from another. The concept of a standard body temperature had only been suggested, the body's heat-regulating mechanism was not understood, and Fahrenheit's recently developed mercury thermometer was not commonly used by physicians. The stethoscope was not invented until 1814, and although a "pulse watch" had been developed in 1707, it also was largely ignored by physicians, who preferred describing the pulse to counting it. The use of percussion to aid in diagnosis, however, was beginning to become more widely understood because of the work of the Viennese physician Leopold Auenbrugger. The reasoning underlying some eighteenth century diagnoses, however, may seem strange to us today. When a fever described as yellow fever responded to quinine, for example, rather than concluding that the fever was in reality malaria, the eighteenth century physician assumed that quinine must be effective against yellow fever. Since all fevers were regarded as stemming from the same physical unbalance, such a conclusion was logical. Rashes were not regarded as particularly significant in diagnosis, and differing symptoms appearing in patients believed to have the same disease might be brushed off as indicative merely of the conditions under which the illness was contracted.5

In the eighteenth century and, as far as the U.S. Army was concerned, until World War I, disease invariably caused more deaths than wounds. It has been estimated that, during the American Revolution, 90 percent of the deaths occurring among the inexperienced, poorly clothed, poorly fed soldiers of the Continental Army, most of them country boys without previous exposure to communicable diseases, and 84 percent of those among the seasoned, disciplined British regulars were from disease. Under the circumstances, however, it is difficult today to determine from the diagnoses and descriptions of eighteenth century physicians what specific diseases were most common in the army of that period. Respiratory illnesses were most often seen in cold weather and dysenterylike conditions in hot weather, while fevers were always a threat. Venereal disease was common, and smallpox could wreak havoc in the ranks of American armies. Scurvy was a danger on land as on sea, and scabies, otherwise known as the Itch, was a more than ordinary nuisance for military forces. Other diseases, such as diphtheria and scarlet fever, were less common in eighteenth century armies despite their occasionally devastating effects upon the civilian population.6

Whatever the eighteenth century diagnoses were, eighteenth century fevers were often in fact malaria, widespread in the colonies and endemic from New England southward. Yellow fever, despite its fearful reputation, was endemic only in the deep South, although rare outbreaks occurred during the summer in a few ports north of Charleston, South Carolina. The incidence of malaria was rising during the Revolution, especially in the South with its long hot summers and undrained swamps, affecting with particular severity, for example, not only Cornwallis's men but also New Englanders participating in the fighting around Yorktown. Although mosquitoes were rarely suspected as carriers of disease, eighteenth century physicians were aware of the relationship of fevers to swamps and undrained areas.7

While "intermittent" and "remittent" fevers were probably malaria, those called putrid, malignant, jail, or hospital may have been either typhus or typhoid. Authorities do not agree on the extent to which late eighteenth century physicians could differentiate between these two diseases.8

An examination of the writings of the period shows, however, that when British military physician and author Richard Brocklesby made his diagnoses, he did not take into consideration the petechiae which modern physicians believe to be the key to a definite differentiation between the two diseases in the absence of laboratory tests. British surgeon Donald Monro, furthermore, believed that the symptoms manifested by a malignant fever depended upon the conditions prevailing at the time the disease was contracted, and the prominent Austrian military surgeon Gerhard van Swieten considered the appearance of a rash to be an indication of a favorable outcome rather than one of the nature of the disease. A modern authority on the epidemic diseases of pre-Revolutionary America also points out that "Not only was typhus rarely found in the colonial period but even after the Revolution the United States remained relatively free of the infection." He ranks typhus in importance after malaria, dysentery, and typhoid.9

Eighteenth century soldiers "often exposed to the putrid Steams of dead Horses, of the Privies, and of other corrupted Animal or Vegetable substances, after their juices had been highly exalted by the Heat of Summer"10 sometimes found themselves afflicted with "A flux of the belly, attended with violent gripings, or very painful strainings for stool."11 This was blamed on "obstructed Perspiration,"12 or on "bile grown acrid by the great heats and the fatigue of war" when "the soldier, when hot, suddenly exposes himself to cold air, or sleeps in his cloaths [sic], soaked with rain" as well as on stagnant water and tainted food. Van Swieten, whose work was influential in America, believed that dysentery could be spread through an entire army by the breath of those afflicted with the disease and that ordinary diarrhea could degenerate into dysentery if not promptly treated with a mild purge followed by a dose of opium. In the eighteenth century, however, even dysentery itself often remained untreated. The soldier unlucky enough to be so afflicted might, indeed, prefer neglect to the very generous doses of purgatives and emetics required by eighteenth century doctrine, or to the blister which would adorn his abdomen should the physician determine that this portion of his anatomy was too tense. It was only after he had survived these remedies that the patient could hope for a dose of opium, which might at least temporarily relieve his agony even if his disease had been misdiagnosed, a distinct possibility, and was actually typhoid or typhus.13

Richard Brocklesby. (Courtesy of National Library of Medicine.)

Gerhard van Swieten. (Courtesy of National Library of Medicine.)

Venereal disease was another ever-present threat to the eighteenth century army but one not always reported because of the punishment often administered to its victims. Some modern historians believe that eighteenth century physicians could distinguish between gonorrhea and syphilis, but examination of a number of publications of the time suggests that the distinction was partial at best. The British John Hunter, one of the century's finest surgeons and anatomists, stated that the two were but "different forms of the same disease," gonorrhea

John Hunter. (Courtesy of National Library of Medicine.)


being the form in which the urethra alone was affected and chancre a nonsecreting version of the same disease. Hunter could state this confidently even while he noted with surprise that mercury, which cured chancre, only made gonorrhea worse.14 Brocklesby, too, considered gonorrhea but one symptom of "lues venerea," as did van Swieten and at least one physician as late as 1815.15

Although it may not have been fatal, scabies brought more patients to British Army hospitals during the Seven Years' War than any other condition, according to British Army surgeon Donald Monro. Its cause was known to be "Little Insects Lodged in the Skin, which Many Authors Affirm They Have Seen in the Pustules by the Help of a Microscope."16 The Itch was very common in colonial America, Benjamin Franklin's mother-in-law having advertised a remedy for it in 1731. The condition usually appeared first between the fingers in the form of a "pustule, or two, full of a sort of clear water, which itch extremely: where these pustules are broke by scratching, the water that issues out communicates the disorder to the neighboring parts . . . in its progress the pustules augment both in number and size, and when opened by scratching a disgusting crust is formed." The favorite remedy for the Itch seems to have been sulfur, a remedy still in use today, applied in either an ointment or a soft soap.17

Among the respiratory ailments often afflicting armies in the eighteenth century were pneumonia, often called peripneumonia, and pleurisy. Van Swieten suggested treating the former, as soon as diagnosed, by a "large bleeding in the arm" but also urged that the air be kept moist and the patient encouraged to bring up the secretions from his lungs. If the progress of the disease weakened the patient, however, this authority recommended that bleeding, purging, and sweating be avoided. The soldier unfortunate enough to be afflicted with so severe a pleurisy that the pain interfered with his breathing would be treated not only to bleeding but also to blistering, clystering, and, to encourage the pus to drain outward and thus avoid the formation of an abscess, plastering. If the patient could not sleep, he might be dosed with a syrup of white poppies. To soothe his cough, his medicine would be administered while lukewarm, and wine, salt, and acid foods would be denied him.18

Specific types of therapy for specific diseases were understandably not too common in the eighteenth century and certain remedies were used for almost all diseased conditions. Bleeding was popular, the amount and frequency varying with the individual physician and the system he followed. A moderate bleeding was considered to be one taking 8 to 12 ounces at a time, a heavy one 16 to 20 ounces. Cleansing the digestive tract was another generalized remedy followed with or without much caution, using such purgatives as rhubarb, manna with tincture of senna, or Rush's favorites, jalap and calomel, emetics such as ipecac and antimony, and enemas of varying formulation.19

The following order for medicines and hospital stores for Fort Meigs placed by Brig. Gen. William Henry Harrison during the War of 1812 suggests that the same kinds of medicines remained popular for many decades:

Peruvian bark (in powder): 50 lb.
[item name missing]: 10 lb.
[item name missing]: 10 lb.
[item name missing]: 5 lb.
Corrosive sublimate: 2 lb.
Tartar emetic: 2 lb.
[item name missing]: 2 lb.
[item name missing]: 10 lb.
[item name missing]: 17 lb.
Rhubarb (in powder): 10 lb.
[item name missing]: 15 lb.
Colombo (in powder): 20 lb.
Nitre crude: 20 lb.
Nitre sweet spirits: 40 lb.
Glaubers salts: 50 lb.
Prepared chalk: 20 lb.
Castor oil: 12 gal.
Olive oil: 5 gal.
Gum arabic: 20 lb.
[item name missing]: 5 lb.
[item name missing]: 20 lb.
Adhesive plaster: 20 lb.
[item name missing]: 2 bbl.
[item name missing]: 300 lb.
[item name missing]: 50 lb.
Blistering ointment: 20 lb.
[item name missing]: 20 lb.
Muriated acid: 4 lb.
Sulphuric acid: 4 lb.
Nitric acid: 4 lb.
[item name missing]: 5 gross
[item name missing]: 3 sets
[item name missing]: 3 sets
[item name missing]: 3 sets
Cases scalpels (No. 6): 3 doz.
[item name missing]: 12 sets
[item name missing]: 7 lb.
[item name missing]: 1,000 yd.
[item name missing]: 200 gal.
Brandy or rum: 100 gal.
[item name missing]: 200 gal.
[item name missing]: 200 gal.
[item name missing]: 300 lb.
Hyson tea: 50 lb.
[item name missing]: 5 bbl.
[item name missing]: 5 bbl.
[item name missing]: 50 lb.
SOURCE: William Henry Harrison to Secretary of War, 30 Jun 1813, Harrison, Messages, 2: 486.


Among the newer ideas in medicine was the belief in the general wholesomeness of fresh air. Benjamin Franklin was among its most ardent supporters. Allied with this was the newly popular "cooling regimen in fevers," involving not only cool fresh air but also the bathing of fever patients in cold water. Yet another generalized remedy of recent origin was mercury, used earlier against venereal disease and as a purgative, but now also used as an alterative to treat many diseases, often in the form of calomel or the reputedly better tasting but more nauseating corrosive sublimate. Its new popularity was, an American physician boasted, "in its origin exclusively American, and ... to our colonial physicians the world is indebted for one of the greatest improvements ever made in practical medicine." Mercury was increasingly prescribed after 1750 for diseases classified as inflammatory, particularly pleurisy, pneumonia, and rheumatism, but it was also eventually used for typhus, yellow fever, dysentery, smallpox, tuberculosis, dropsy, hydrocephalus, and diseases of the liver, with Rush in the forefront of its many enthusiasts.20 Another noted American physician of the period, David Ramsay, explained the working of mercury by claiming that it set up an artificial illness, "transferring diseases of the head, of the eyes, and of the bowels to the mouth, where they are less dangerous and more manageable"21 in line with the principle put forward by John Hunter that "no two fevers can exist in the same constitution, nor two local diseases in the same part at the same time."22 Others accounted for the action of mercury by its weight, saying that mercury compounds expelled "morbid matter" from the digestive system and cleared out the glands, particularly the salivaries, and the blood vessels, promoting better circulation and eliminating disease.23

It is not likely that there were any physicians at that time unaware of the unpleasant side effects of mercury, since even laymen could recognize them and physicians were at times forced to order smaller than usual doses lest the patient realize what he was receiving and refuse to take it. Diarrhea, bleeding gums, nosebleeds, and loosening of the teeth were among the consequences the American physician John Warren described as "frequently troublesome and at times alarming."24

A far more pleasant eighteenth century remedy was wine, the use of which was most prevalent from 1700 to 1900. It was used as a stimulant to the appetite, a sedative, a diuretic, and merely as a nutritious addition to the diet. Particular wines might be prescribed to relieve specific symptoms, with or without the addition of spices, fruit juices, and grated rinds. An astringent red wine was recommended for diarrhea, port for anemia and acute fevers, claret and burgundy for anorexia, and champagne for nausea and throat ailments. White wines were thought to make fine diuretics, while fortified wines in general were prescribed for convalescents.25

Opium, often in the form of laudanum, came to be used frequently in the late eighteenth century, to the point where it was "generally found in most decent families for domestic prescription." Samuel Bard, a noted colonial physician, had studied the effects of opium on the human body and recorded that it acted on the nervous system rather than the blood, slowed bowel action, suppressed urine, lessened pain and spasms, and brought sleep.26

Quite aside from the state of the art of diagnosis and treatment, the hospital in which he found himself played an important role in the fate of the sick or wounded soldier and, indeed, in the long run, in the fate of the army itself. The wars of the eighteenth century and particularly the Seven Years' War between France and Britain, therefore, inspired much writing by experienced army surgeons of both nations on the military hospital and its management. Considerable emphasis was placed upon the need for planning ahead for the number and types of hospitals required. The Frenchman Hugues Ravaton, considering this problem in 1768, wrote that one could assume that three of every 100 soldiers would be ill at the beginning of a European campaign. Halfway through the campaign, a probable five or six of 100 would be out of combat because of disease, and by the end of a campaign, if the victims of venereal disease and nonbattle injuries were counted, ten to twelve of 100 would be unable to fight because of illness. A day's battle would produce, he estimated, ten wounded per 100 combatants, but this percentage would drop as the number involved approached 100,000.27

Armed with this type of information, an army medical department could predict the number of general and regimental hospitals needed for the predictable number of patients. The general hospital remained in one place and was staffed by the physicians and surgeons of the medical department, as was the flying hospital in the British Army, which accompanied an army as it moved about. The regimental hospital was directed by the regiment's surgeon and mates, and its patients could be sent back to the general hospital, as were those of the flying hospital when it had to move. Considerable controversy raged at times between the proponents of the general hospital and those of the small regimental one. The infection and bad air of the general hospital caused many authorities to favor the smaller facility or at least to urge that the general hospital be divided among several locations. Locating hospitals so that the sick or wounded soldier would have to be moved for any great distance was also considered inadvisable.28

In the management of the individual hospital, eighteenth century military experts emphasized the necessity for good order, good air, and careful sanitation, as well as for the proper staff, which should include physicians, surgeons, mates who functioned as assistants to either physicians or surgeons, apothecaries, nurses, and a purveyor to handle supplies. Although the principle was not always observed in the British Army, Monro in particular emphasized that the direction and the purveying for a hospital should never be assigned to the same man, "as the Temptation of Accumulating Wealth has at all Times, and in all Services, Given Rise to the Grossest Abuses," a truth of which the organizers of the Continental Army's first medical department should have been more conscious. Also required was a guard to keep order and to see that patients obeyed the orders of the physicians, particularly in regard to such matters as drinking and receiving visitors. An inspector general should be assigned to visit each hospital at least once a week to ensure proper care and discipline.29

Hale's Sketches of Plans for Ventilators. (Courtesy of National Library of Medicine.)

Good air supply could be achieved by using large rooms with high ceilings, by avoiding overcrowding, and even by using the newly developed, manually operated Hales ventilators, by means of which "foul air may be removed from hospitals," by burning frankincense, juniper wood and berries, or sulfur, or by using the steam from vinegar. Good sanitation also required proper care of privies, frequent scrubbing of walls, floors, and bed frames, and thorough airing or even impregnating woolen clothing and blankets with the fumes of muriatic or nitric acid or of burning sulfur. Straw rather than feathers was preferred for filling mattresses because of the difficulty of cleaning feather beds.30

Contagion could also be prevented by putting surgical and venereal disease patients into a separate room or even a separate hospital, away from those ill with fevers and dysentery, who should have their own well-aired wards and, hopefully, their own nearby privies. As much as possible, those with similar diseases should be kept together. Each patient should, in any case, be bathed and dressed in clean clothing before being admitted to a ward, and his hands and face washed routinely every morning. The physician visiting patients should wear clothing especially set aside for the purpose plus a "waxed linen coat to wear above them in going round to wards" and should never visit a hospital with an empty stomach if he wished to avoid becoming ill himself. When visiting patients with infectious diseases, furthermore, the physician should take tincture of bark before entering the room, put rolls of lint impregnated with camphorated spirits in his nose, place a bowl of camphorated vinegar near the patient he was visiting, and hold his breath while physically examining the patient, standing back from the bed when it was necessary to ask a question. Peter Middleton, another prominent American physician, suggested that the doctor in such instances might even hold tobacco in his mouth to guarantee that he did not swallow saliva which might have become infected while he was in the sickroom.31

Preventive medicine played an extremely important role not only within the military hospital but also throughout the entire eighteenth century army. Increasing attention was now also being paid to the idea of being sure that the soldier was reasonably healthy before he entered the service, especially since it was recognized that the stress of entering the army made the new recruit particularly susceptible to disease. It had long been realized that, since disease was caused by "changes in the sensible qualities of the air, excesses in dirt, and irregularities in exercise," effective prevention of disease also required the close regulating of the soldier's everyday life and environment once he had officially joined the army. The choice of a campsite was of the greatest importance, with damp areas being particularly undesirable. Thick forests were also to be avoided because of the restricted circulation of air there. Where the army was living in tents, the straw used for bedding must be quickly changed should it become damp, while officers should spread waxed cloth on the damp ground under their beds to keep out the moisture. When it was raining, the tents should be tightly drawn to lessen water penetration and drainage ditches should be made around them. Campsites and barracks should, of course, be kept clean and free of accumulated garbage. Privies should be built near the camp, preferably over running streams. Even after exercising all these precautions, however, the army should avoid camping too long in one place, especially "when bloody flux prevails."32

Concern about the soldier and the weather extended to the soldier's clothing. Since sudden temperature changes, as occurred when the soldier left a warm hut to go outside in the winter, were considered particularly dangerous, it was considered wiser in cold weather to keep the men warm with extra garments and blankets than with the creation of great heat by stoves or fires. Flannel rather than linen, some authorities said, should be worn next to the skin, especially, of course, in the winter. At least one American authority strongly favored wearing flannel underclothing even in the summer, since it would prevent chilling if the temperature were to drop markedly during the night. John Jones, one of America's foremost army surgeons, however, believed that lighter clothing should be worn in hot weather and that linen, washable and cheaper to buy, was preferable to flannel.33

Concern went beyond the soldier's camp and clothing to his personal hygiene. It should be required that "no soldier be permitted to ease himself anywhere about the camp except in the privies." He should bathe


frequently, in the running water of a warm stream when possible, and wash his clothes as often as he could. He should keep his hair short or at least greased and combed daily to avoid the buildup of dirt and perspiration on his head.34 Pringle believed that even the amount of sleep the soldier received should be regulated, since "when soldiers are off duty, they sleep too much, which enervates the body, and renders it more subject to diseases." Adequate exercise was also most important.35

Close supervision of what the soldier ate and drank was necessary. Ideally he should drink only from the center of a moving stream or from wells giving pure water. When this was impossible, impure water should be mixed with vinegar or with chalk or alum and then filtered. By the early 1800's, the presence of "animalcules" in swamp water was recognized and boiling as well as straining of questionable water was urged. For hot weather use, Jones suggested adding vinegar, highly regarded by many for its supposed ability to preserve the health in the summer. This mixture would "serve to correct in some measure, the natural tendency of the humours to corruption, at that season." The drinking of beverages stronger than water, however, caused considerable discussion. Monro recommended diluted spirits for soldiers scheduled for night duty in cold weather, and Jones agreed that something stronger than water or "small beer" was necessary for men long exposed to the cold and damp. The American Army surgeon Edward Cutbush, however, warned against giving out undiluted spirits to soldiers and suggested that beer be substituted for stronger liquors.36

Defective diet was not classified as among the most important causes of illness. Many believed that, by having to contribute his food allowance to a mess and thus being unable to squander it on liquor, the soldier was guaranteed an adequate diet. Some concern was shown, however, that too much meat was being eaten in proportion to fruits and vegetables in the diet. Sugar was recommended as a method of making vegetables taste better, so as to lower the consumption of meat. Others speculated as to whether fresh fruit caused dysentery, Pringle believing it did not. Because of the concern for the possibly harmful effect of meat, the proportion of it given to hospital patients was restricted for those on "low" and "middle" diets. By the early years of the nineteenth century, concern was also being shown by Americans for the health of the meat animals the Army used for its mess halls.37

The only important deficiency disease generally recognized at this time was scurvy. By mid-century, James Lind had conducted experiments which showed to his satisfaction that oranges, lemons, and limes cured even the severest scurvy. By the time of the American Revolution, van Swieten had recognized that the eating of plenty of ripe fruits and vegetables could prevent this condition, but he and a surprising number of authorities, including even Lind himself, continued to assume that additional factors were also involved. Lind noted the importance of "warm, dry, pure air, with a diet of easy digestion, consisting chiefly of a due mixture of animal and vegetable substances," adding that "a glass of good sound beer, cyder, wine, or the like fermented liquor" would be advisable.38 Van Swieten, as late as 1776, blamed scurvy on "Noisome vapours, arising from marshy grounds, and stagnating waters, inaction, drinking of corrupted and stagnating waters. . . . damp and low lodging" as well as "scarcity of green vegetables, . . . the use of salted and smoaked [sic] flesh and fish, and of cheese too old and acrid." A number of others writing in the period immediately preceding the Revolution expressed themselves along similar lines.39 Nevertheless, Lind urged that the army plant "antiscorbutic plants" where it was garrisoned, suggesting, among others, "garden cresses," and in 1776, the Continental Congress told its Medical Committee to be sure to have sufficient antiscorbutics on hand for the army operating in the North. In 1808, an army physician commented that if fresh fruits were impractical, an essence could be made of oranges and lemons to be used as a preventive for scurvy.40

The only disease for which prevention in the form of immunization was available was smallpox. Inoculation, the deliberate introduction into the body of material infected with the smallpox virus, thereby causing a mild case of that disease, was widely used during wartime by the British Army even before the American Revolution, after its initial introduction in both England and the colonies in the 1720's. The procedure ran into much opposition, especially in New England, in part because the person inoculated became a source of infection for all those with whom he came in contact. Many areas hoped to avoid smallpox merely because of their isolation, yet the ultimate result of this attitude was the appearance in the American Revolutionary Army of a large number of men who were not immune to the disease. The method of inoculation most commonly used in the colonies at this time was the Sutton or Dimsdale method, named after the Englishmen who developed it. As adopted in the colonies, this method required a two-week preparation period before inoculation, during which the patient was put on a diet of light and nonstimulating foods, dosed with mercury and antimony, bled, and purged. The inoculation itself was done by means of puncture rather than incision, as had been customary earlier, and on the leg, so that it would be as far as possible from the head and other vital areas. The patients were put on a "cooling regimen," exposed to cool air, and permitted to drink cool water while suffering from the disease which, acquired in this manner, was reputed to have become "an innocent disease" with a death rate of one in 1,000, compared to a rate of one in ten for smallpox caught by unintentional exposure. Nevertheless, John Cochran, soon to gain prominence as the fourth Director General of the Army Medical Department, wrote in 1772 of his concern that the mercury given during the smallpox inoculation process could lead to other diseases once the patient had recovered from the smallpox itself.41

It was not until the very early nineteenth century that Dr. Edward Jenner's method of preventing smallpox by vaccination with the cowpox virus became popular. By 1800, Jenner had vaccinated about 6,000 patients, and by 1804, his method had proved itself by causing a dramatic drop in smallpox deaths in both London and Vienna, without bringing with it the risk of a general epidemic. Although epidemics of smallpox continued to appear in the United States even after the introduction of vaccination in 1800, by 1812 the procedure had been accepted throughout most of the nation as a replacement for inoculation with smallpox, and in May 1812, shortly before the outbreak of the War of 1812, orders were issued to Army surgeons to vaccinate the entire Army using material from cowpox vesicles.42


While "Medicine ran into the cul de sac of therapeutic nihilism . . . , surgery, with all its imperfections . . . , could and did cure with some confidence" in the eighteenth century. By 1750, surgery was a respectable profession. The work of John Hunter, who was described as the first Englishman to raise surgery to the level of a science, added to its respectability, while the many wars of the eighteenth century were giving surgeons opportunities for observation and study which they otherwise would never have had. American John Jones urged that surgeons acquire as much learning and training as possible43 and took particular exception to the idea he had heard suggested that surgeons should operate only under the direction of physicians, becoming merely "surgical machines . . . under the direction of their medical masters." Jones also preferred that medicine and surgery be combined, as they were to such a great extent in the colonies.44

A surprising variety of surgical procedures was occasionally performed by a few eighteenth century surgeons, but such operations as lithotomy, or cutting for bladder stones, the setting of fractures, the reduction of dislocations, and amputations were very common. The skill with which Jones was able to perform lithotomies when he returned from his training abroad removed from this operation the bad name it had acquired in the colonies. Jones was reported to be able to perform a lithotomy in three minutes and at times in one and a half, using a lateral perineal approach. Other surgeons of the time wrote of dealing successfully with torn tendons, hydrocele, various types of hernias, and fistula lachrymalis, or abnormal openings in the tear duct. In France, Louis XIV's successful surgery in 1686 for anal fistula had inspired confidence both in French surgeons and in this operation, while the British surgeon William Cheselden had attracted attention with, among other accomplishments, his successful operations for cataract.45

Isolated reports of successful gynecological and obstetrical surgery began to appear in this period, proof that abdominal surgery was not always undertaken in vain. American John Bard reported in 1760 on surgery he had performed ten years earlier, in which he removed a dead fetus resulting from an abdominal pregnancy and the patient survived. In the following years there were other scattered reports in the colonies of successful surgery performed for ectopic pregnancies, and in 1809, the Edinburgh-trained American physician Ephraim McDowell performed his famed ovariotomy. By the time of the American Revolution, Jones could write of the proper handling of penetrating wounds of the abdomen with enough confidence and in sufficient detail to make it obvious that he believed survival from such wounds to be possible. He described the type of suture he used for wounds in the intestine and discussed the removal of sutures when the injury had healed, as if the patient's survival to this point were not a total surprise. Among other achievements in eighteenth century surgery was the work of William and John Hunter, with whom a number of the Continental Army's surgeons studied, in developing a new surgical technique for handling aneurysms, one soon followed by others. There were also rumors in the 1760's of a splenectomy so well executed that the soldier-patient was able to return to duty. A successful appendectomy was performed in 1759, although apparently little notice was taken of it. Furthermore, before the end of the century, the indomitable John Hunter was experimenting with organ transplants in chickens, moving the testicles and spurs of cocks from one bird to another. He also noted that "Teeth, after having been drawn and inserted into the sockets of another person, united to the new socket which is called transplanting."46

Regardless of the nature of the injury, the medical care the surgical patient was likely to receive ran along familiar lines: "moderate evacuation, by bleeding, and gentle purging, together with a low diet."47 In addition, "when the wounded person has not suffered any great loss of blood, it will be advisable to open a vein immediately and take from the arm a very large quantity, and to repeat bleeding, as circumstances require, the second, and even the third day." To be considered in making decisions on bleeding the surgical patient, Hunter believed, were the violence of the inflammation, the power of the patient's body to make blood, the distance of the injury from the sources of circulation, the nature of the part injured, and the duration of the inflammation.48

Among the potions most highly regarded by military surgeons were opium and bark, usually Peruvian or cinchona bark, the source of quinine, praised in almost identical terms by both Jones and British Army surgeon John Ranby, who extolled "the sovereign and almost divine power of opium; next to this I likewise add bark, a medium which no human eloquence can extol with panegyric proportioned to its inestimable victories." Of bark he added, "I have known it to procure rest, if given in large doses, when even opium had been taken without any manner of effect."49 Van Swieten regarded the bark as the best available remedy against gangrene, and John Jones maintained that it contracted the blood vessels, "restoring their due action upon the blood, when too great a quantity of that necessary fluid is lost by profuse haemorrhage, provided the larger wounded vessels are secured by a proper ligature from future bleeding." On the other hand, "where there is a great fulness, or too much strength and contractile powers in the solids, and an inflammatory state of the system . . . the bark is not advisable."50

Excess bleeding from a wound was one of a number of problems with which eighteenth century surgeons wrestled. Although crude tourniquets had been used as early as the sixteenth century to arrest hemorrhage before and during surgery, a new type of tourniquet, developed by French surgeon Jean Louis Petit in 1718 and tightened by means of a screw, was now being employed. The importance of proper ligation of major arteries was well recognized. Ligatures did not become popular, however, until almost a century after their initial appearance, partially because amputation above the knee was not often performed, surgeons being well aware of the fact that the danger to the patient increased with the level of the amputation, and partially because the cautery was sufficient to control bleeding from small-caliber vessels. Closure of the wound was achieved with a variety of sutures. John Hunter preferred to use, wherever possible, the dry suture, since otherwise additional points of infection could appear at the site of each stitch, but he pointed out that the dry suture was not appropriate for penetrating wounds. Four other types of suture were also used after mid-century: the twisted, then used in harelip repair and for wounds presenting similar problems, and the interrupted, quilled, and glovers' sutures, this last almost entirely for wounds of the intestines or stomach. Both interrupted and quilled sutures were used with abdominal wounds, with the latter preferred. Jones recommended the removal of sutures "as soon as the union is complete, which generally happens either the second, or third day, often in twenty-four hours." The double-flap technique for amputations had already been developed, but the promotion of healing by the first intention was considered to require great care and forethought.51 It was suggested by Hunter for "wherever a clean wound is made in sound parts and where the surfaces can be brought into contact, or where there is sufficient skin to cover the part." On the other hand, wounds involving lacerations might be impossible to handle by the first intention, while wounds which might contain foreign matter should be left open so that pus could bring any debris to the surface.52

The care of the patient after surgery was carefully supervised, both to ensure appropriate care of the wound itself and to note and treat possible complications. Poultices were called for when it was desired that the wound suppurate; where poultices were not appropriate, Hunter maintained, dry lint was likewise unsuitable as a dressing. He suggested in such cases that oil and wax or some other "unctuous matter" be used, or that it might even at times be better not to use ointments at all. Jones praised this British conservatism and added that, if an ointment had to be used, it should at least be very mild. The use of wine and alcohol as remedies against putrefaction was recommended by van Swieten, while turpentine, used in the eighteenth century to combat minor bleeding, now appears to have real value against bacteria.53

The appearance of "laudable pus" was regarded as a good sign, since suppuration was considered to be the body's attempt to rid itself of harmful materials. Pus caused a "very happy effect, by separating the lacerated vessels and extravasated fluids from the sound parts which then grow up a-fresh. Hence laudable pus is esteemed by surgeons as one of the best signs." Redness and heat around a wound were seen as inevitable, as was fever in serious wounds.54 It was expected that the various healing processes might bring on a condition called "the hectic," the result of the body's trying unsuccessfully to heal itself and characterized by "debility, a small, quick, and sharp pulse; the blood forsaking the skin; loss of appetite; often rejection of all aliment by the stomach; wasting; a great readiness to be thrown into sweats; sweating spontaneously when in bed; frequently a constitutional purging; the water clear."55 Once "digestion of the wound" had taken place, fever, inflammation, and pain could be expected to lessen. The average eighteenth century surgeon was so unaware of the causes and effects of infection that, when a colleague achieved an unusually low mortality rate, the explanation was sought in his surgical techniques and not in his standards of cleanliness.56

Among other possible complications arising in the wounded soldier was tetanus. Thomas, in 1815, admitted the difficulty of bringing a patient through tetanus and stressed instead attempts to prevent it. He cited among these efforts the Navy's custom of using a wound dressing containing tincture of opium and the use by a Dr. James Clark of calomel before and after surgery as a preventive. Rush noted in 1787 the vast quantities of opium then routinely used to treat tetanus, but stated that he personally had effected a gradual but complete cure by administering two to three ounces of bark and up to three pints a day of wine and raising a blister between the patient's shoulder blades to which he applied a mercurial ointment.57

The types of surgery most commonly performed in the Army involved, of course, gunshot wounds and their consequences. By the late seventeenth century, when it was realized that the ball was not poisonous, surgeons were urged not to probe deeply but rather to let the ball remain if it could not be located easily. Jones preferred to dilate wounds to facilitate drainage or to aid in the reduction of fractures, but Hunter, believing that the edges of a wound were quite elastic, concluded that enlarging the opening was not required unless there was severe hemorrhaging, a suspected skull fracture, or debris or bone fragments to be removed. Many surgeons now believed that the caustic dressings once popular for gunshot wounds were no longer necessary.58

When battle injuries involved fractures, the question of amputation arose, many surgeons favoring immediate amputation in compound fractures. Such injuries, when treated in crowded hospitals, were all too frequently followed by infection, making amputation eventually necessary even though similar fractures, treated in a rural environment, might never require such drastic measures. Van Swieten and John Hunter preferred to postpone amputation until it was obvious in each instance that it would be required. Amputation did not necessarily save the patient's life, however. The mortality rate was often 45 to 65 percent where the leg was removed at mid-thigh, although some surgeons lost markedly fewer patients than this. Jones believed that immediate amputation was definitely advisable when the heads of bones were broken or capsular ligaments were torn. When the fracture involved the skull, the danger posed to the brain by excessive pressure was well recognized and the eighteenth century surgeon was prepared to trephine. The patient would be fortified against these ordeals by the administration of opium and, perhaps, rum and his ears filled with lamb's wool to deaden the sound.59

Although they were at times willing to attempt a wide range of surgery, eighteenth century surgeons generally acknowledged their inability to save patients with wounds involving the heart, aorta, cerebellum or medulla of the brain, or the cisterna chyli. Wounds of the spinal cord, major blood vessels, or the principal organs of the chest or abdomen were considered very dangerous. Amputation at the shoulder was generally avoided because of the danger of hemorrhage. And, of course, anyone could become fatally infected, a fact of which army surgeons in particular were only too well aware.60


Increasing numbers of young American medical students were studying abroad; their training added to the prestige of their profession in the colonies, although specialization in the European manner was rendered impractical by the generally scattered population of the colonies which required the physician to become a medical jack-of-all-trades, physician, surgeon, and apothecary all in one. The medicine practiced by some of these men carried purging and bleeding beyond the extent customary in Europe, and although physicians with European training were at times most outspoken in their criticism of the poor quality of American medicine, it was often they who, like Rush, went to those extremes which we today so deplore. The social status of physicians at this time could be great; they were found in colonial legislatures and local assemblies, on college boards, and among the wealthy merchants and landed proprietors so influential in colonial society.61

It has been estimated that by 1775 approximately 3,500 physicians were practicing in the colonies, caring for a population of three million, but that only 400 had M.D. degrees from medical colleges. By 1776, the two medical schools then functioning in America had conferred only 51 M.D. degrees. A majority of colonial physicians were either self-taught or taught through apprenticeships; and of the approximately 1,200 medical practitioners serving with the Continental Army in the Revolution, only an estimated 100 had M.D. degrees. Among the leaders of the Army's Hospital Department, however, were some of the best trained physicians in the colonies.62 (See Appendix A.)

Training by apprenticeship was a highly individualized process which varied with the preceptor. Some required their apprentices to have a knowledge of Latin, natural history, mathematics, and grammar, but others were less demanding. Some apprentices might spend a large part of the traditional seven-year training period doing menial and semimenial tasks, receiving at the end of that time a certificate which was almost meaningless.63

As the colonies became increasingly prosperous, greater numbers of young Americans, including many future leaders in the Continental Army's Hospital Department, went abroad for their medical education. Proper preparation for such study was considered to include a college education followed by three years of apprenticeship to a physician who himself was well educated. The British schools, which were at the height of their popularity before the American Revolution, included Edinburgh, where clinical instruction was emphasized. Paris, where surgery was increasingly respected, became more popular with Americans after the French Revolution. Colonial students at this time usually studied in Paris only after having received their M.D.'s, as did John Morgan, William Shippen, Jr., and Benjamin Rush after receiving their degrees from Edinburgh. John Jones earned his M.D. at Rheims before moving on to Paris to study under eminent French surgeons. A few individual European practitioners were especially influential, among them, of course, Cullen of Edinburgh. In London, the Hunter brothers, William and John, also played important roles in the lives of many American medical students, including not only Rush and Jones but also Morgan and Shippen, both of whom later served as Director General of the American Army Medical Department during the Revolution. John Pringle was another strong influence upon young American students of surgery.64

The first form of organized classes in medicine in the colonies was a system of lectures and demonstrations, often managed by a physician whose training abroad led him to feel acutely the need for better medical education in the colonies. Classes in human anatomy, however, were highly unpopular among ordinary citizens, who associated dissection with body snatching and regarded it as a desecration of the human body. Even postmortem examinations were infrequent and their purpose was limited to determining cause of death when murder was suspected. Nevertheless, in 1762, Shippen began a series of anatomy classes involving both a human body and a series of anatomical plates and casts donated by a prominent London physician. His first class held only ten students and triggered a minor riot, Shippen being accused, despite his denials, of grave robbing. The popular assumption that dissection of the human body implied body snatching lasted at least until the 1788 Doctors' Mob riot in New York City when three days of violence, put down only by military force, inspired the first practical laws regulating such matters.65

The first American medical school was established in Philadelphia in 1765 by Morgan. His failure to consult Shippen, with whom he had formulated the early plans for the school while both men were studying at Edinburgh, added fuel to the fire of the enmity already growing between the two men. Morgan was given the first appointment as professor at the new school, where Shippen, who had been teaching for several years in Philadelphia, and later Rush, among others, joined him. Students at this school were required to have studied Latin, Greek, mathematics, and philosophy, and Morgan strongly recommended a familiarity with French as well. By the turn of the century, three more medical schools were in operation in the United States, at King's College, Harvard, and Dartmouth, all with faculties in which foreign-trained physicians predominated.66

By the time the Revolution broke out, American physicians were no longer isolated from the thought of their colleagues, either those in their own communities or those in Europe. Books and journals were available to them, and scientific societies in the more populous areas gave them a forum where they were encouraged to present their own ideas. Here again, however, the shadow of things to come was cast by the formation in Philadelphia of rival societies by groups led on the one hand by Morgan and on the other by Shippen. Few of the medical publications in circulation in the eighteenth century were of American origin, but a number of public and private libraries were being created and foreign authors became very influential, among them not only the British Pringle and Brocklesby but also the Austrian van Swieten, French surgeons such as Jean Louis Petit and Henri François Le Dran, and many others. The influence of French military medicine also appeared in the form of translations of the French Journal de Médecine Militaire which were circulated in the United States from 1783 to 1790.67

Some of the men who were to join the American Army Medical Department had gained limited experience in military medicine during the French and Indian War. Morgan, for example, was an army surgeon with the Pennsylvania militia at that time and John Jones had taken part in the British attack on Canada. Experience with the British exposed some of the apprentice-trained colonial physicians to a higher level of practice than they had known before and familiarized them with the lower level operations of an army medical department. During the Revolution, even such men as Benjamin Rush, who had no previous military experience, expressed a wish to see the American Army Medical Department modeled more closely upon its British counterpart.68

During this period, the British Army's medical department was headed by a council of three civilians, usually prominent London practitioners, functioning as Physician General, Surgeon General, and Inspector General. Under them an inspector general was assigned to every twenty- to thirty-thousand-man army, while yet another inspector general served each division of four to five thousand men. The principal, or general, hospital for such an army might have 400 beds and a staff of six medical officers (including a physician and two assistants, a surgeon and one assistant, and an apothecary) and fifty other workers. The need for a sergeant to keep order among hospital patients was first recognized during the French and Indian War, and two orderlies were appointed who could assist in maintaining discipline. The British also emphasized hospital sanitation, good air and ventilation, and the prevention of overcrowding. It was intended that each regiment have its own surgeon and, in wartime, two assistants, and its own hospital in a house or tent. Fairly high standards were used in the selection of physicians, but regimental surgeons bought their commissions without showing any qualifications except that of experience, although in this instance, after the French and Indian War, higher standards were being urged. Regimental surgeons serving the Americans during that war tended to be poorly trained. The sick and wounded at times went untended for days, no efficient system of evacuation was developed until the Napoleonic Wars, and supplies and food were inadequate. The troops occasionally suffered from jaundice, and there was no evidence that the problem of typhus was handled any better by the British then than by the Americans during the Revolution.69

The practice of military medicine in the eighteenth century by nations experienced with the demands of war left much to be desired, even by the standards of the time. It could not, therefore, logically be expected that the newly created Continental Army Medical Department, directed by relatively inexperienced men whose personal antipathies toward one another were rapidly growing, could, in the early years of the Revolution, function in a manner we would find acceptable today.