- Frances Bengtson
During the American Civil War, the United States faced a scale of casualties and disease that challenged medical infrastructure in a way previously unimaginable in U.S. history. Beyond the hundreds of thousands of soldiers who lost their lives during the conflict, an estimated 280,000 or more military personnel were non-mortally wounded in the Union forces alone. In the face of such unprecedented suffering, the nation’s medical system was forced to adapt rapidly. While surgeons and field hospitals are often the public face of Civil War medicine, much of the essential record keeping that sustained medical care fell to hospital stewards. These noncommissioned officers quietly documented the war’s medical toll, maintained hospital registers, tracked supplies, and created the administrative backbone of the wartime medical system.
Hospital stewards were enlisted men chosen for their literacy, reliability, and ability to manage detailed tasks. The position had existed before the war, but the massive scale of the conflict expanded both its scope and its importance. It was not an easy position to acquire. According to Joseph Janvier Woodward’s Hospital Steward’s Manual (1862), stewards “must have…sufficient knowledge of…pharmacy to take charge of the dispensary, acquainted with minor surgery, …application of bandages and dressings, extraction of teeth, application of cups and leeches, …knowledge of cooking” and must be “industrious, patient, and good tempered,” qualities vital for men responsible for managing sensitive medical records and inventories. Charles F. Beal, the acting assistant surgeon at Dunbarton Street Hospital in Georgetown and a hospital steward hopeful, noted that he had gathered recommendations from three surgeons while preparing his application. Candidates also had to pass a competitive examination, after which the appointment had to be confirmed by the Secretary of War. Unlike many soldiers, who may have been illiterate or unskilled in clerical work, stewards were often well educated and brought ample prior experience to the role.
Thus, the duties of hospital stewards went far beyond dispensing medicine. Stewards kept admission and discharge registers, updated daily reports, and compiled statistics on disease prevalence, mortality rates, and surgical outcomes. They also tracked inventories of medical supplies, linens, bedding, and food. By maintaining accurate records, they ensured that hospitals could requisition supplies before shortages became critical. This was especially important in an era when infection and disease killed more soldiers than battlefield wounds did; maintaining proper supplies was quite literally lifesaving. In addition, stewards were responsible for documenting medical experiments, treatments, and even autopsies. Surgeons often dictated notes, but stewards copied them into official hospital records, and these case notes became valuable data for military medicine.
The importance of hospital stewards did not end with the conclusion of the war. Their record keeping provided the foundation for postwar medical analysis, including the Medical and Surgical History of the War of the Rebellion, one of the most comprehensive medical reports produced in the nineteenth century and still a critical source for historians and medical researchers alike. Beyond preserving data for posterity, these records also influenced veterans’ care, helping to document injuries and illnesses for pension claims and ongoing treatment. By translating individual patient experiences into organized, systematic documentation, hospital stewards ensured that the knowledge gained during the war could inform both medical practice and public health for years to come.
Today, the work of Civil War hospital stewards can be recognized as an early step toward the standardization of medical record keeping. Medical records are now the foundation of healthcare: they track patient histories, coordinate treatment among providers, and supply data for research. The Civil War demonstrated for the first time on a national scale that organized, centralized medical records were essential for both immediate care and long-term study. While no single position of hospital steward exists today, stewards were in many ways the predecessors of modern medical records technicians, health information managers, clinical administrators, and more. Their patient registers foreshadowed today’s electronic health records, while their statistical reports anticipated epidemiology and outcomes research. By demonstrating the value of systematic record keeping, they helped shift medicine from an individual practice toward a data-driven science.
The Civil War forced the United States to confront medical challenges on an unprecedented scale. Hospital stewards, though often overshadowed by surgeons and nurses, were the unsung workhorses of the wartime medical system. Their record keeping made possible not only the daily functioning of Civil War hospitals but also the long-term advancement of American medicine.
References
AMEDD/NCO Enlisted Soldier History. (n.d.). Retrieved September 19, 2025, from Army.mil website: https://achh.army.mil/regiment/nco-historynco
Campbell, W. T. (2014a, October 24). Meet the Hospital Steward. Retrieved September 19, 2025, from Civilwarmed.org website: https://www.civilwarmed.org/surgeons-call/steward1/
Campbell, W. T. (2014b, December 24). Seven Hospital Stewards. Retrieved September 19, 2025, from Civilwarmed.org website: https://www.civilwarmed.org/surgeons-call/steward2/
Campbell, W. T. (2018). Overworked, Undermanned and Indispensable: Hospital Stewards in the Civil War. Retrieved September 19, 2025, from Jstor.org website: https://www.jstor.org/stable/26483969
Defense casualty analysis system. (n.d.). Retrieved September 19, 2025, from Osd.mil website: https://dcas.dmdc.osd.mil/dcas/app/summaryData/casualties/principalWars
The hospital steward’s manual: for the instruction of hospital stewards, wardmasters, and attendants, in their several duties : prepared in strict accordance with existing regulations, and the customs of service in the armies of the United States of America, and rendered authoritative by order of the Surgeon-General - Digital Collections - National Library of Medicine. (n.d.). Retrieved September 19, 2025, from Nih.gov website: https://collections.nlm.nih.gov/catalog/nlm:nlmuid-101526781-bk
Union Medical Civil War Facts. (n.d.). Retrieved September 19, 2025, from Mycivilwar.com website: https://www.mycivilwar.com/facts/usa/usa-medical.html
Sunday, September 21, 2025
The Post-War Legacy of Civil War Dentistry
- Kavn Aulakh
The Civil War is often remembered for its devastating battles and medical crises. However, one greatly overlooked aspect of a soldier’s health was dental care. Toothaches, infections, and broken teeth plagued the armies. The Union Army entered the Civil War without a single commissioned dentist, leaving soldiers at the mercy of overworked and untrained surgeons for makeshift treatments. The Confederacy, faring only slightly better, experimented with dentists by 1864, but dentistry as a whole remained primitive by modern standards. While the war ended without an official military dental corps, the widespread suffering of soldiers—combined with the advocacy of wartime dentists and professional organizations—laid the foundation for systemic change. The legacy of Civil War dentistry can be traced directly to the creation of the Army and Navy Dental Corps in the early 20th century and to the recognition of oral health as essential to both military readiness and civilian life.
The Civil War highlighted the devastating effects of neglecting dental care. Soldiers often entered service with compromised teeth, the result of poor hygiene and diets dominated by hardtack, salted meats, and excessive sugar. These conditions accelerated decay and gum disease. For most men, treatment for such pain meant extraction without anesthesia, usually performed by untrained surgeons with crude instruments. In the Confederacy, the conscription of civilian dentists into military hospitals by 1864 revealed both the immense demand for care and the benefits of professional treatment. These efforts demonstrated the direct impact of oral health on a soldier’s ability to fight, eat, and survive, and they planted the seeds for post-war reform.
Even before the war ended, organized dentistry began lobbying for recognition. In 1863, the American Dental Association, then known as the American Dental Convention, petitioned for a military dental corps, citing the impact of dental problems on soldiers’ efficiency. A prominent dentist, Samuel S. White, even met with President Abraham Lincoln to argue the case. However, Secretary of War Edwin Stanton dismissed the proposal as an unnecessary luxury amid wartime crises. Confederate dentists who had treated soldiers in makeshift hospitals continued to document the oral suffering they witnessed. Their reports ensured that dentistry remained part of the post-war conversation on military medicine, even though progress was slow.
The decades following the war saw very gradual change. In 1872, Dr. William Saunders was hired as the first Army dentist at the U.S. Military Academy at West Point. However, he was employed only as a contract “acting assistant surgeon” rather than a commissioned officer. Saunders is nevertheless regarded as the Army’s first official dentist. Through the late 19th century, additional contract dentists served at larger posts, but their presence was inconsistent and underfunded. Meanwhile, Civil War veterans, many of whom carried lifelong dental problems due to wartime neglect, were living reminders of the importance of dentistry in the military. Their hardships added weight to the argument that oral health was not a luxury but a necessity for both readiness and long-term health.
The tipping point came with the Spanish–American War of 1898, when the U.S. military once again faced high rates of preventable dental problems. Soldiers were sidelined by abscessed teeth or unable to chew their rations, reigniting the call for reform. By 1911, the U.S. Army officially established its Dental Corps, followed by the Navy in 1912. Dentists became permanent members of the commissioned medical staff, tasked with ensuring that oral health no longer undermined fighting capacity. This milestone can be traced directly back to lessons first learned during the Civil War.
The war also reshaped dentistry beyond the military. Wartime shortages forced practitioners to innovate with materials such as tin foil and amalgam, which later became standard alternatives to gold fillings. The postwar decades saw rapid advancements, including the invention of the foot-powered drill in 1871, the adoption of antiseptic practices, and the professionalization of dentistry through new schools and licensing standards. These changes helped dentistry shed its old reputation as the trade of “tooth-pullers” and emerge as a respected healthcare profession.
Today, the legacy of these reforms is clear in the military’s concept of dental readiness. Soldiers undergo comprehensive dental exams before deployment, and those with untreated abscesses or other serious conditions are barred from overseas duty until treatment is complete. This policy is a direct descendant of Civil War lessons, when poor oral health undermined soldier effectiveness.
While the Civil War ended without a formal dental corps, it fundamentally reshaped the trajectory of dentistry in America. The war exposed the cost of neglect, demonstrated the value of skilled practitioners, and galvanized post-war advocacy that eventually brought dentistry into the heart of military medicine. Beyond the battlefield, the war accelerated dentistry’s professionalization, transforming it into a respected and essential branch of healthcare. In both military and civilian contexts, the Civil War stands as the crucible in which modern dentistry was forged.
References
Dalton, Kyle. “The Myth of Two Teeth.” The Medical Record. National Museum of Civil War Medicine. March 7, 2022. Accessed September 17, 2025. https://www.civilwarmed.org/two-teeth/.
Hyson, John M., Joseph W.A. Whitehorne, and John T. Greenwood. A History of Dentistry in the U.S. Army to World War II. Falls Church, VA: Office of the Surgeon General, U.S. Army, 2008.
Murphy, Robert. “Smile! The Evolution of Dentistry During the Civil War.” HistoryNet. November 18, 2021. Accessed September 15, 2025. https://www.historynet.com/smile-the-evolution-of-dentistry-during-the-civil-war/.
Rosenberg Library Museum. “Civil War Dental Surgeon’s Kit.” Treasure of the Month. Accessed September 17, 2025. https://www.rosenberg-library-museum.org/treasures/civil-war-dental-surgeons-kit.
University Associates in Dentistry. “Spotlight on Dentistry in the Civil War.” University Associates in Dentistry (Chicago). Accessed September 17, 2025. https://www.uadchicago.com/uncategorized/spotlight-on-dentistry-in-the-civil-war/.
Saturday, September 20, 2025
The Dawn of Anesthesia in the Civil War
- Leila “Izzy” Jones
By the time the American Civil War began, two anesthetic agents had been developed and were already in routine use in American hospitals. The options were diethyl ether and chloroform, the first substances capable of producing general anesthesia suitable for surgery. Their widespread use in the painful operations of the Civil War firmly established anesthesia as the standard of care and provided physicians with opportunities to refine its administration. Surgeons discovered optimal delivery techniques, recognized its limitations, and alleviated the suffering of thousands of soldiers undergoing lifesaving procedures. This era laid the foundation for anesthesiology to emerge as a dedicated specialty and evolve into the robust field of medicine it is today.
Diethyl ether had been discovered in 1525 by Paracelsus, yet its clinical utility remained unrecognized for centuries (Hargrave, 2025). Not until October 16, 1846, did Dr. William T. G. Morton publicly demonstrate its effectiveness at Massachusetts General Hospital in what became known as the “Ether Dome.” His patient, Edward Gilbert Abbott, inhaled ether vapor until unconscious, at which point Dr. John Collins Warren excised a vascular tumor from his neck. Abbott remained immobile throughout the procedure and later reported no pain. Morton, effectively the first practicing anesthesiologist, famously declared to the assembled audience of physicians and students, “This is no humbug!” (MassGen). The demonstration marked a turning point in surgical practice and introduced the world to the possibilities of pain-free operations.
Chloroform, however, would soon surpass ether in popularity, especially during the Civil War. First synthesized independently by several scientists in 1831, chloroform’s anesthetic potential was not recognized until 1847 (History of Anesthesia). That year, Scottish obstetrician Dr. James Y. Simpson and two colleagues experimented with various inhaled substances in search of an alternative less irritating and less flammable than ether. After inhaling chloroform vapors from handkerchiefs in Simpson’s living room, the trio abruptly lost consciousness. Upon regaining their senses, they realized its extraordinary potential. Simpson quickly incorporated chloroform into his obstetrics practice, easing labor pain for countless women. By 1853 Queen Victoria even received chloroform during childbirth, solidifying its respectability (SOAP). Compared with ether, chloroform offered rapid onset, reduced airway irritation, and, crucially for military settings, far lower flammability. These qualities made it the preferred anesthetic of Civil War surgeons working in field hospitals vulnerable to fire and explosion hazards (Dalton, 2020).
Although both ether and chloroform had been adopted enthusiastically in the United States and abroad, the Civil War was the first large-scale conflict to apply them systematically, thereby transforming anesthesia into the standard of surgical care. The unprecedented volume of operations, particularly amputations, allowed surgeons to refine their dosing strategies and gain a better understanding of anesthetic physiology. Chloroform was most often administered using a cone-shaped sponge or a rigid cone lined with a chloroform-soaked sponge. When the cone was held over the patient’s nose and mouth, unconsciousness was typically achieved within about nine minutes (Reimer, 2017). With careful administration of additional chloroform in the same manner, unconsciousness could be maintained for fifteen to thirty minutes, long enough for the majority of battlefield procedures.
Amputation was by far the most common surgery requiring anesthesia, with over 60,000 performed during the war. Approximately 95% of all operations utilized ether or chloroform (Reimer, 2017). At this volume of use, complications inevitably arose. Some patients received excessive doses too rapidly and never regained consciousness. Chloroform depresses cardiac and respiratory function, and in rare cases this suppression became fatal. Contemporary estimates suggest a mortality rate of about 5.4 deaths per 1,000 anesthetic administrations, often in patients already gravely wounded or otherwise unstable (Devine, 2016). Civil War physicians learned from these rare complications and grew adept at anesthetic delivery. Their collective experience helped establish anesthesiology as a discrete medical discipline in the postwar years.
The timing of anesthesia’s discovery was extraordinarily fortunate. It spared thousands of soldiers the agony of surgery and allowed surgeons to operate with greater precision, producing improved outcomes. Chloroform was even administered during the amputation of General “Stonewall” Jackson’s arm. As he drifted into unconsciousness, he is reported to have remarked, “What an infinite blessing” (“Battle of Chancellorsville”). Though Jackson later died from his injuries, many others survived due to the skilled use of chloroform, which became known among troops as a “soldier’s best friend during a painful surgery.”
Despite chloroform’s ubiquity, the image of Civil War soldiers “biting the bullet” during amputations persists in popular memory. This enduring myth likely reflects earlier practices, when patients might bite on leather straps or wood to endure pain before anesthesia became widespread. However, no reliable records exist of Civil War soldiers using bullets in this way, and physicians most likely would not have permitted such a dangerous choking hazard. Discoveries of bitten bullets are better explained by the gnawing of pigs on abandoned battlefield debris (Reimer, 2017). Combined with the enormous number of amputations, these artifacts gave rise to a legend of stoic heroism that has since overshadowed the historical reality that chloroform, not grit, spared soldiers from agony.
Ultimately, anesthesia proved to be a transformative medical innovation that arrived just in time for the Civil War. It enabled surgeons to perform complex procedures with greater accuracy while drastically reducing patient suffering. Its extensive wartime use generated a wealth of clinical experience, demonstrating that anesthesia was both safe and effective when administered properly. The Civil War thus solidified anesthesia as an indispensable component of surgical practice and accelerated the growth of anesthesiology as a specialty, laying the groundwork for the discovery of many more sophisticated pharmacologic agents in the decades that followed.
The Birth of Modern Medicine: Causalgia and Hospital Gangrene in the Civil War
- Cristen Huynh
The American Civil War was a period of immense military and social upheaval, but it also served as an unexpected catalyst for significant medical advancements. With an unprecedented number of casualties, including 620,000 deaths and 860,000 wounded soldiers, the war provided a grim laboratory for physicians to observe, document, and categorize a new generation of wounds and diseases (1). Although the Civil War era is often remembered for its primitive medical practices and outdated theories, the systematic study of causalgia and hospital gangrene marked a crucial intellectual shift. These two "proto-diagnoses" compelled physicians to move beyond the simple treatment of symptoms and instead investigate the underlying pathologies of disease and injury. This foundational work in recognizing complex neurological pain and the empirical link between sanitation and infection laid the groundwork for the modern fields of neurology and public health.
The first of these pivotal discoveries was causalgia, a term coined by the Philadelphia physician Silas Weir Mitchell. Working at Turner's Lane Hospital in Philadelphia, Mitchell studied the devastating effects of the minié ball. Unlike the ammunition of earlier conflicts, the advanced design of the soft lead minié ball produced greater fragmentation and energy transfer upon impact. This often caused extensive bone pulverization and soft tissue damage, leading to complex and previously unseen nerve injuries (2). Mitchell observed a distinct condition in which soldiers suffered an excruciating, burning pain out of all proportion to the initial wound, often triggered by the slightest touch, change in temperature, or sudden noise. He described this sensation as "the most terrible of all the tortures which a nerve wound may inflict" (2,3). Mitchell’s documentation of these symptoms and the patients’ psychological distress marked a departure from the prevailing medical wisdom, which lacked a framework for understanding complex pain phenomena. The burning pain characterizing causalgia was not secondary to inflammation or wound infection, but rather a neurological disorder caused by damage to peripheral nerves (4). Mitchell’s 1864 publication, "Gunshot Wounds and Other Injuries of Nerves," co-authored with George R. Morehouse and William W. Keen, stands as a seminal work in the history of neurology. It provided the first comprehensive description of neuropathic pain and postulated a new cause-and-effect relationship in the nervous system. The detailed case studies and careful clinical observations in this work provided the bedrock for the modern field of neurology. Today, Mitchell's causalgia is recognized as Complex Regional Pain Syndrome Type II, a chronic neuropathic condition characterized by the interplay of peripheral and central nervous system abnormalities. This modern diagnosis recognizes the disorder as more than just pain, encompassing symptoms like hyperalgesia and autonomic disturbances, with a multifactorial pathophysiology that continues to challenge physicians (5). Ultimately, the Civil War provided the necessary clinical volume for Mitchell to identify a novel syndrome. His publications on causalgia became a cornerstone of modern neurology, creating a new field of inquiry dedicated to understanding the intricate mechanisms of neuropathic pain.
The second medical challenge, and perhaps the more horrifying, was hospital gangrene. This lethal and highly contagious bacterial infection was a source of widespread devastation in the crowded, unsanitary field hospitals of the Civil War. Surgeons of the time were unaware of germ theory and operated in environments that were breeding grounds for pathogens. Instruments were often used on multiple patients without sterilization, and wound dressings were rarely changed, creating a vicious cycle of infection (1). Hospital gangrene was characterized by rapidly progressing necrosis, a distinct foul odor, and a mortality rate of up to 60%. This swift and devastating course presented a significant and often insurmountable clinical challenge for physicians of the era (6, 7). A minor wound could rapidly escalate into a fatal infection because hospital gangrene was a form of necrotizing soft-tissue infection. From this crisis emerged the initial empirical steps toward modern infection control. While physicians of the era lacked the concept of microscopic bacteria, they were keen observers of patterns. They noted that hospital gangrene spread from patient to patient, suggesting a contagious element. Union Army surgeon Middleton Goldsmith began experimenting with aggressive treatments. He observed that the application of chemical agents, such as nitric acid and bromine, directly to infected wounds appeared to halt the infection's spread. His use of bromine was particularly effective, dropping the mortality rate for his patients to less than 3%. Although Goldsmith had no knowledge of germ theory, his empirical observations proved that a chemical agent could neutralize the causative agent of the infection (8). The lessons learned from hospital gangrene were not rooted in a scientific understanding of its etiology, but in the systematic observation of its epidemiology and treatment. This period of trial and error set the stage for Joseph Lister's groundbreaking work on germ theory and antiseptic surgery in the years that followed. The Civil War provided the grim evidence that sanitation was a matter of life and death, serving as an essential precursor to the modern age of surgical hygiene and hospital-wide infection control.
In conclusion, the American Civil War represents a pivotal moment in medical history. The unprecedented scale of battlefield injuries provided a grim opportunity for physicians to observe and analyze conditions that had previously been poorly understood. Through the rigorous study of causalgia, Silas Weir Mitchell provided the first comprehensive description of neuropathic pain and initiated the intellectual foundation for the field of neurology. Simultaneously, the devastating and highly contagious nature of hospital gangrene compelled the medical community to recognize the undeniable link between sanitation and patient outcomes. These hard-won lessons, born from the battlefield and hospital wards, shifted the paradigm of medical practice and served as the precursors to modern antiseptic surgery, public health, and a more analytical approach to diagnosis.
References
1. Reilly RF. Medical and surgical care during the American Civil War, 1861-1865. Proc (Bayl Univ Med Cent). 2016 Apr;29(2):138-42. doi: 10.1080/08998280.2016.11929390. PMID: 27034545; PMCID: PMC4790547.
2. Mitchell, S. W., Morehouse, G. R., & Keen, W. W. (1864). Gunshot Wounds and Other Injuries of Nerves. J. B. Lippincott & Co.
3. Mitchell, S. W. (1872). Injuries of Nerves and Their Consequences. Lippincott.
4. Lau FH, Chung KC. Silas Weir Mitchell, MD: the physician who discovered causalgia. J Hand Surg Am. 2004 Mar;29(2):181-7. doi: 10.1016/j.jhsa.2003.08.016. PMID: 15043886.
5. Guthmiller KB, Dua A, Dey S, et al. Complex Regional Pain Syndrome. [Updated 2025 May 4]. In: StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2025 Jan.
6. Clipson, R. (2023). Hospital Gangrene in the Civil War. National Museum of Civil War Medicine. https://www.civilwarmed.org/hospital-gangrene-in-the-civil-war/
7. Bosshardt TL, Henderson VJ, Organ CH Jr. Necrotizing soft-tissue infections. In: Holzheimer RG, Mannick JA, editors. Surgical Treatment: Evidence-Based and Problem-Oriented. Munich: Zuckschwerdt; 2001.
8. Trombold JM. Gangrene therapy and antisepsis before lister: the civil war contributions of Middleton Goldsmith of Louisville. Am Surg. 2011 Sep;77(9):1138-43. PMID: 21944621.
Pathological Draft Resistance in the Civil War: Self-Inflicted Medical Exemptions
- Mitch Ford
During the Civil War, desperation to avoid military service drove some draftees to extreme lengths. One example was the self-extraction of teeth: men would literally pull out their own healthy front teeth (or hire willing dentists to do so) in hopes of failing the army’s dental requirements.[1] This form of draft evasion reflected a broader pattern of men injuring themselves or feigning illness to secure medical disqualifications. These acts not only carried risks of infection, hemorrhage, and long-term disability but also forced military authorities to adapt. Army surgeons grew wise to these ploys, tightening examination standards and even pursuing those seeking evasion. This essay examines medically motivated draft dodging in the Civil War, beginning with the case of pulled teeth and expanding to the wider context of self-harm and feigned illness as means of avoiding service.
Both the Union and Confederacy relied on the draft as the war progressed. From the start, medical standards determined eligibility. Among the most famous requirements was possession of enough teeth to bite open powder cartridges for rifles. Union regulations stated that loss of incisors and canines from both jaws disqualified a recruit, since he needed “a sufficient number of teeth in good condition…to tear his cartridge quickly and with ease.” The ability to chew hard crackers and salt pork for nutrition was equally important, as was dental health as a marker of overall fitness.[2]
During the Union draft of 1863, over 5,200 men, about 2.4 percent of those examined, were exempted for dental reasons. Examiners even had an abbreviation for men rejected due to teeth: “4-F,” reportedly shorthand for the lack of four front teeth. At first, losing teeth for any reason could bar a man from service, but standards tightened as reports emerged of deliberate extractions. Surgeon Robert Bartholow’s 1864 manual clarified that missing incisors were not grounds for exemption unless the loss was clearly due to disease.[3] Confederate practice, by contrast, was more lenient. Regulations said little about teeth, and as personnel shortages grew worse, even men with grave disabilities were kept in the ranks. One Confederate soldier had lost most of his teeth and much of his jawbone to mercury treatment. He would have been discharged by Union examiners because he was unable to chew solid food, yet he remained in service until his death in 1864.[4]
Missing teeth were only one part of a wider trend. Once the draft began, many men plotted how to fail the physical. Surgeons noted a surge of “accidents” such as missing fingers and inflamed eyes. Some draftees limped on an old injury, inflated their heart rates by running before an exam, or tied ropes to swell their veins and mimic circulatory disease. Feigning deafness or blindness was common; doctors devised tests, such as speaking softly or watching for involuntary eye movements, to catch the fraud. Some draftees rubbed cayenne or lye into their skin or eyes to create rashes and swelling. One Minnesota draftee soaked his feet in lye for ten days to secure an exemption, only to be arrested when his feet “miraculously healed.” Lastly, and probably most notably, others cut off trigger fingers or toes.[5] Newspapers often reported on such acts, condemning the men as cowards.[6]

W. E. S. Trowbridge, Candidates from the Exempt Brigade
Union officials quickly caught on. After the Enrollment Act of 1863, local draft boards with examining surgeons began sharing ways to spot men faking illness or injury. They devised trick tests and looked for fresh injuries. They would test claimed deafness or blindness by observing involuntary reactions, and would note when a supposedly injured man forgot to limp under distraction.[7] Exemptions for unhealed gums began to be denied; Bartholow emphasized in his manual that the sudden loss of healthy teeth did not qualify.[8] Some officers even suggested drafting men who had pulled their teeth anyway, so they would become the laughingstock of their fellow soldiers.[9] Follow-up exams further exposed fraud. In the North, men who had been excused were sometimes called back weeks later, and if their problems had disappeared, they were arrested. These jailed offenders were often publicized as cases to deter others.[10]
Deliberate self-injury to evade the draft caused outrage. Newspapers blasted “coward hordes” who cut off extremities or fled abroad. The Chattanooga Daily Gazette in 1864 mocked men who paid surgeons to remove teeth, suggesting they be drafted anyway to face ridicule.[11] The Cleveland Herald in 1862 condemned a man who chopped off his toes, arguing he had forfeited manhood and should be cast from society.[12] These reactions show how closely patriotism, masculinity, and military service were tied together during the Civil War era. To many, self-mutilation looked like cowardice and selfishness, but the reality was likely a little more complicated. Many soldiers were poor and fearful of death in a war that had already killed thousands. Drafted into a war against their will, these men could have reached a breaking point where the certainty of personal injury seemed preferable to the roulette of combat.
Men pulling their own teeth to escape Civil War service illustrates the tension between self-preservation and duty. From tooth extractions to faked illnesses and mutilations, draft dodgers forced examiners into a back-and-forth contest of deception and detection. Ultimately, few succeeded in escaping service this way, but the symbolism remained important. To their contemporaries, the man who maimed himself to dodge combat was a coward. For historians, these acts reveal the psychological toll of the draft and the extraordinary lengths men would go to avoid war. They remind us that alongside tales of heroism, the Civil War also produced scars of desperation, some of them self-inflicted.
Bibliography
Chattanooga Daily Gazette (Chattanooga, TN). “Self-Mutilation to Escape the Draft.” August 9, 1864. Quoted in Aileen E. McTiernan, “Suicide and Self-Mutilation.” Gale Library of Daily Life: Slavery in America. Gale, 2008.
Daily Cleveland Herald (Cleveland, OH). “Self-Mutilation.” October 10, 1862. Quoted in Aileen E. McTiernan, “Suicide and Self-Mutilation.” Gale Library of Daily Life: Slavery in America. Gale, 2008.
Dalton, Kyle. “The Myth of Two Teeth.” The Medical Record. National Museum of Civil War Medicine. March 7, 2022. Accessed September 18, 2025. https://www.civilwarmed.org/two-teeth/
Marzoli, Nathan. “Fraud and Deception: Challenges for Enrollment Board Surgeons, 1863–1865.” The Medical Record. National Museum of Civil War Medicine. November 30, 2018. https://www.civilwarmed.org/enrollment-board-surgeons/
Riaud, Xavier. “A Dentist from the North Removed a Tooth in a Southman’s Mouth (American Civil War 1861–1865).” Journal of Otolaryngology – ENT Research 10, no. 6 (2018): 344.
Trowbridge, W. E. S. Candidates from the Exempt Brigade. Lithograph on wove paper, 1862. Library of Congress Prints and Photographs Division, Washington, D.C. https://www.loc.gov/pictures/item/2008661641/
Recognizing Invisible Pain in the Civil War
- Akayi Thein
The Civil War, one of the deadliest wars in American history, produced casualties on an unprecedented scale. The staggering number of wounded soldiers confronted physicians and surgeons with unfamiliar syndromes and conditions, creating an exceptional opportunity for systematic observation and documentation that laid the groundwork for modern diagnoses.
One of the most perplexing and widely observed conditions was irritable heart syndrome, also known as “Soldier’s Heart,” first described by Dr. Jacob Mendes Da Costa, a civilian contract surgeon in Philadelphia (3). Between 1861 and 1865, Da Costa studied over 300 soldiers presenting with a constellation of baffling symptoms: chest pain, a rapid and irregular pulse, difficulty breathing, fatigue, weakness, and a host of psychological complaints, including nightmares and sleep disturbances. He noted that while not every patient experienced all of these symptoms, every case of “irritable heart,” as he initially called it, included a rapid pulse, palpitations, and chest pain, often following a digestive complaint (3).
Da Costa’s work was groundbreaking for its time, but his conclusions were limited by the era’s medical framework. The concept of psychological trauma as a cause of physical ailment was largely nonexistent. A few physicians of the time suggested the cardiac symptoms were “precipitated by battle trauma,” but a truly modern understanding of the condition would not emerge for decades; Sigmund Freud’s work on the subconscious was still years away. Da Costa therefore attributed the condition not to emotional distress but to an over-taxing of the nervous system (3). He concluded that it was a physiological, rather than a psychological, disorder, and his widely published work established the condition as “Da Costa Syndrome” in the medical literature (1). Today, the symptoms he so carefully documented are recognized as a classic presentation of Post-Traumatic Stress Disorder (PTSD) (3). This shift from viewing the condition as a purely nervous-system disorder to a psychological one illustrates the dramatic evolution of medicine in the subsequent century.
Another significant medical discovery to emerge from the war was the phenomenon of phantom limb pain, a term introduced by another Philadelphia physician, Silas Weir Mitchell, in 1871 (5). Although Mitchell is credited with coining the term and medicalizing the condition, the experience itself was well known among the thousands of amputee soldiers and the military surgeons who treated them. Many physicians at the time, however, dismissed the complaint because there was no visible physical cause to justify the pain. This dismissal had devastating consequences for the soldiers, who were not only suffering from chronic, agonizing pain but were often stigmatized as malingerers feigning illness to avoid service (4).
Mitchell nonetheless believed that the soldiers’ pain was real and sought to legitimize it in the eyes of both the public and the medical community. He took an unusual approach, anonymously publishing a short story about a fictional amputee named George Dedlow who suffered from this mysterious pain (5). The story resonated powerfully with readers, and many, believing Dedlow to be a real person, even tried to contact the hospital to offer their support (4). This overwhelming public response demonstrated the reality of the soldiers’ suffering and helped Mitchell fight the pervasive stigma. Following the story’s success, Mitchell published clinical articles in medical journals, introducing the concept of pain in an amputated limb as a legitimate injury of the nerves and naming it “phantom limb pain” (2). Today, there is a much more robust understanding of this neurological phenomenon, and a variety of treatments, from physical therapy and medication to mirror therapy, have since been developed.
The U.S. Civil War, while a period of immense devastation, inadvertently became a pivotal moment for American medicine. The dedicated work of doctors like Da Costa and Mitchell, who meticulously documented the baffling conditions they encountered, led to a deeper understanding of the complex relationship between the mind and body. While their initial conclusions may have been flawed by the limitations of their time, their systematic efforts provided the essential groundwork for modern diagnoses like PTSD and a more compassionate, evidence-based approach to chronic pain. The legacy of these two conditions underscores how an empathetic approach to human suffering can change the course of medical history.
References
Da Costa JM. On irritable heart: a clinical study of some functional cardiac disorders and their diagnosis. Am J Med Sci. 1871;61(1):17-52.
Mitchell SW. Phantom limbs and the ghosts of war. Harper’s New Monthly Magazine. 1871;43(257):697-703.
National Museum of Civil War Medicine. Irritable heart: fictional Dr. Foster and actual Dr. Jacob M. Da Costa. Civil War Medicine. Published January 20, 2016. Accessed September 17, 2025. https://www.civilwarmed.org/irritableheart/.
University of Pennsylvania, Perelman School of Medicine, Department of Neurology. Silas Weir Mitchell (1829-1914). Penn Neurology. Accessed September 17, 2025. https://www.med.upenn.edu/neurology/silas-weir-mitchell.html.
Mitchell SW. The case of George Dedlow. The Atlantic. 1866;18(106):26-48.
Orthopedic Surgery in the Civil War and Present Day
- Joshua Longmire
The Civil War led to numerous advancements in the field of orthopedics. Through extensive trial and error, and with limited understanding of germ theory, Civil War surgeons developed ideas and practices that still shape how patients are cared for today. During the war, orthopedic treatment mainly consisted of conservative surgery, excisional surgery, or amputation. Amputation was the most common treatment: about 50% of extremity gunshot wounds involving fractures were treated with amputation, which was thought to offer the best chance of avoiding further injury or death (Kuz). Surgeons experimented with many amputation techniques, and anteroposterior flaps proved more effective than the circular or guillotine methods at preserving function (Kuz). As the war went on and surgeons gained experience, they leaned more toward conservative surgery and debridement rather than primary amputation, reserving secondary amputation for cases where it later became necessary. The most common conservative procedures were excisions of the shoulder, elbow, and hip, with the hip being by far the least successful. After these excision surgeries, soldiers were commonly placed in a splint, and passive motion was begun as soon as possible (Kuz).
Another advancement came with traction devices and splints, notably Buck’s traction, which is still used today for hip fractures, and plaster splints. The first attempts at internal fixation of fractures were also made during the Civil War. Only a handful of attempts were documented, chiefly by Dr. Benjamin Howard, who would “resect the comminuted ends of the fracture site, place the ends in opposition, and use a special drill and suture passer to hold the bone ends together using a wire” (Kuz). A further development in orthopedic surgery, and in wartime surgery generally, was an emerging consensus on the best time to operate after an injury. Although the question was heavily debated, most surgeons came to agree that primary surgery should be performed within 48 hours of injury and secondary surgery after 30 days; the interval in between, the intermediate period, produced the most surgical deaths. An unlikely advance in treating osteomyelitis and gangrene came when Confederate surgeons observed that wounds infested with maggots healed at much higher rates (Kuz). This became the primary form of treatment, with maggots bred under sterile conditions.
With so many injuries and surgeries during the Civil War, a need for prosthetics arose. The Association for the Relief of Maimed Soldiers became the primary provider of prosthetics to soldiers with amputations, supplying 769 in total. Fitting a prosthetic also required that surgeons amputate no more proximally than the distal third of the tibia and that they use anteroposterior flap amputations for the best outcomes. As more prosthetics were made and the war progressed, advancements included “a single-axis ankle controlled by vulcanized rubber bumpers and a transfemoral prosthesis with a polycentric roller knee, multiarticulated foot, and endoskeletal construction” (Kuz).
The specialty of orthopedic surgery grew tremendously during the Civil War and shortly afterwards. The first orthopedics textbook is thought to have been written about soldiers during the war by Louis Baur, the first orthopedic professorship was created in 1861 by Lewis Sayre, the first orthopedic residency was started in 1863 under James Knight, orthopedics was recognized as a specialty by the first major hospital in 1872, and the American Orthopaedic Association was formed in 1887 as the first U.S. orthopedic organization (Kuz). Orthopedic surgery grew out of the heavy caseload surgeons were thrown into during the war, and many wartime surgeons used what they had learned to treat patients and veterans afterward.
Looking at orthopedic surgery’s procedures today shows just how far the specialty, and the care patients receive, has come since the Civil War. Amputations during the Civil War carried a 26.3% mortality rate (Reilly). By comparison, in modern orthopedic surgery the procedure with the highest mortality rate, above-the-knee amputation anywhere on the femur, carries a rate of 7.22% (Ernst et al). What was once a field defined almost entirely by wound and trauma care has grown into a specialty with many facets: preventative medicine, diagnosis and treatment of disorders and illnesses, and treatment, therapy, and recovery for musculoskeletal problems. Orthopedic surgeons now use a variety of tests, such as X-rays, CT scans, MRIs, blood work, and physical exams, to determine the patient’s problem and the best course of treatment (Liang et al). Once surgeons have a diagnosis, they can draw on a multitude of techniques and procedures. Arthroscopy, a minimally invasive technique that uses a camera, allows surgeons to access and treat conditions in numerous joints through only a few small incisions. Another mainstay is joint replacement: surgeons can remove diseased or injured joints and implant manufactured replacements, dramatically improving joint function and patients’ lives (Liang et al). Instead of simply amputating a limb or removing parts of joints as in the Civil War, surgeons now have the techniques and equipment to repair and replace affected body parts far more often.
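As a rough, purely illustrative comparison of the two mortality figures cited above (they come from different eras, patient populations, and reporting standards, so the arithmetic should not be read as a controlled comparison):

$$\frac{26.3\% - 7.22\%}{26.3\%} \approx 0.73$$

that is, roughly a 73 percent relative reduction between the Civil War amputation mortality rate and the modern rate for the highest-mortality orthopedic procedure.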
An area of orthopedic surgery that began during the Civil War was fracture repair. In addition to casting, external fixation, and internal fixation, all of which were attempted during the Civil War, surgeons can now use bone grafts and bone stimulation to replace bone and promote healing. Building on the splints and traction devices of the Civil War, orthopedic surgeons today have many techniques for external fixation. These include the Ilizarov technique, a set of wires and pins connected to a circular frame around the affected limb to correct length discrepancies, and the Taylor Spatial Frame, a frame with struts and pins attached to the affected limb to correct leg-length differences or deformities (Liang et al). Orthopedic surgeons can also perform osteotomies to cut and reshape bone in patients with certain conditions or deformities, and can use bone fusion to stabilize damaged joints and relieve pain. The field also has a far greater focus on soft tissue than it did during the Civil War: surgeons can use grafting techniques, percutaneous repair, arthroscopic repair, and open surgery to treat many problems not related to bone (Liang et al).
Finally, orthopedic surgery now has aspects that could not have been imagined during the Civil War. The specialty is advancing toward robotics in surgery, 3D printing of devices and implants, regenerative medicine, virtual reality, telemedicine, and artificial intelligence (Liang et al). Orthopedic surgery today addresses a much wider range of problems and offers more individualized diagnosis and treatment, much of it built on foundations laid during the Civil War. Techniques such as trauma management, amputation, Buck’s traction and plaster splints, open treatment of contaminated wounds, internal and external fixation, and resection and extraction of bone and fragments all stemmed from the Civil War and led to the highly sophisticated field of orthopedic surgery we have today (Kuz).
References
1. Ernst, B. S., Kiritsis, N. R., Wyatt, P. B., Reiter, C. R., O’Neill, C. N., Satalich, J. R., & Vap, A. R. (2025). Ranking the Orthopedic Procedures With the Highest Morbidity and Mortality. Orthopedics, 48(1), e40–e44. https://doi.org/10.3928/01477447-20240913-02 (Original work published January 1, 2025)
2. Kuz J. E. (2004). The ABJS presidential lecture, June 2004: our orthopaedic heritage: the American Civil War. Clinical orthopaedics and related research, (429), 306–315.
3. Liang, W., Zhou, C., Bai, J., Zhang, H., Jiang, B., Wang, J., Fu, L., Long, H., Huang, X., Zhao, J., & Zhu, H. (2024). Current advancements in therapeutic approaches in orthopedic surgery: a review of recent trends. Frontiers in bioengineering and biotechnology, 12, 1328997. https://doi.org/10.3389/fbioe.2024.1328997
4. Reilly R. F. (2016). Medical and surgical care during the American Civil War, 1861-1865. Proceedings (Baylor University. Medical Center), 29(2), 138–142. https://doi.org/10.1080/08998280.2016.11929390
Medical Education in the Post-Civil War South
- Ashley Isenberg
The Civil War not only transformed the political and social landscape of the United States but also fundamentally reshaped the medical profession, particularly in the South. While Northern physicians entered the conflict with stronger institutional ties to hospitals, more reliable access to cadavers, and a tradition of clinical instruction, Southern medical schools were often proprietary institutions with limited resources and inconsistent curricula.[1] When the war began, most Southern schools closed, leaving only the Medical College of Virginia open. In the postwar period, the South faced the challenge of rebuilding its devastated infrastructure while also modernizing its medical education. This essay explores how Southern schools adapted after the Civil War, emphasizing the reforms that gradually aligned them with Northern standards.
Before the war, Southern medical education was characterized by shorter lecture terms, limited emphasis on dissection, and minimal hospital training. Whereas Northern schools benefited from anatomy acts such as Massachusetts’ 1831 law and New York’s 1854 “Bone Bill,” which secured a legal cadaver supply, Southern students relied on grave robbing or personal appeals to governors for access to anatomical subjects.[2] Clinical instruction was similarly limited. At the University of Virginia in 1850, for example, candidates were required to complete two lecture courses in anatomy, medicine, and chemistry, but there was no mandatory hospital component. Charlottesville General Hospital did not even open until 1861, leaving graduates with little hands-on surgical experience.
The wartime consequence of this educational gap was that Southern surgeons were thrust into battlefield hospitals with minimal operative experience, often learning amputation techniques on wounded soldiers rather than in a controlled learning environment. In contrast, many Northern professors, including Samuel Gross of Jefferson Medical College and William Hammond of the University of Maryland, authored surgical manuals or led the Union’s Medical Department.[3] The South entered the war at a distinct disadvantage, both in training and in institutional support.
Given the lack of prior training, the Confederacy was forced to improvise and learn on the spot. The Confederate Surgeon General, Samuel Preston Moore, commissioned new manuals tailored to resource-limited settings, the most notable being J.J. Chisolm’s Manual of Military Surgery (1861).[4] Southern surgeons turned to local herbal remedies when cut off from imported drugs like quinine and morphine, and many younger physicians gained surgical exposure only through battlefield necessity. While these adaptations revealed ingenuity, they underscored the lack of a standardized medical education system prior to the war.
Following the war, Southern states moved quickly to address the deficits in medical training the conflict had exposed. By the 1870s, institutions such as Tulane, Vanderbilt, and the Medical College of Georgia began to re-emerge as important centers of medical education. These schools consciously modeled themselves on Northern universities, adopting graded multi-year curricula that replaced the traditional two terms of lectures. This shift was significant: students were now expected to progress through increasingly advanced coursework rather than repeat the same set of lectures twice.
Dissection also became more standardized in the postwar years. Although a uniform legal framework for cadaver donation would not arrive until the twentieth century with the Uniform Anatomical Gift Act (1968), Southern schools gradually secured more consistent access to anatomical material, bringing their training closer in line with national norms.[5]
Perhaps the most important reform in the South was the integration of hospitals into medical education. Prewar Southern schools were often located in smaller towns or rural settings without large patient populations, which severely limited clinical exposure. After the war, schools in urban centers such as New Orleans, Richmond, and Nashville developed partnerships with hospitals that allowed students to gain bedside experience. This mirrored the longstanding model of Northern institutions like the University of Pennsylvania, which had required attendance at Philadelphia Hospital or Pennsylvania Hospital as early as 1840.[6]
The new emphasis on hospital-based training in the South not only improved the quality of education but also aligned with broader national trends. By the late nineteenth century, the apprenticeship model was giving way to an institutional model in which hospitals, laboratories, and formal curricula defined the physician’s training.
The Civil War exposed the weaknesses of Southern medical education, but it also created an impetus for reform. Confederate alumni such as Joseph Jones of the Medical College of Georgia later became advocates for scientific rigor and clinical observation.[7] By joining the American Medical Association in greater numbers after the war, Southern schools linked themselves to national debates about standards, curricula, and licensing. Over time, these reforms helped Southern schools move away from the proprietary, profit-driven model that had dominated before the war.
Still, progress was uneven. Many institutions struggled financially, and it was not until the Flexner Report of 1910 that sweeping, nationwide reform would fully eliminate underperforming proprietary schools.[8] Nevertheless, the Civil War created a moment of reckoning for the South. By demonstrating the human cost of underprepared surgeons, the conflict forced Southern educators to invest in anatomy, hospital partnerships, and graded curricula that better prepared their graduates for the demands of modern medicine.
References
[1] Medical Lecture Tickets: Historical Narrative. University Archives and Records Center. Published March 29, 2018. Accessed September 19, 2025. https://archives.upenn.edu/exhibits/penn-history/medical-lecture-tickets/history/
[2] Anatomical Theatre at the University “Subjects” for Anatomy Class. Virginia.edu. Published 2023. https://exhibits.hsl.virginia.edu/anatomical-theatre/subjects-for-anatomy-class/index.html
[3] Hammond, William A. Circular. No. 2. Washington City: Surgeon General’s Office, 1862. Print.
[4] Chisolm’s Manual of Military Surgery, Civil Practice to Civil War: The Medical College of the State of South Carolina 1861-1865. Musc.edu. Published 2025. Accessed September 19, 2025. https://waring.library.musc.edu/exhibits/civilwar/ChisolmMMS.php?
[5] Sadler AM. The Uniform Anatomical Gift Act. JAMA. 1968;206(11):2501. doi:https://doi.org/10.1001/jama.1968.03150110049007
[6] University of Pennsylvania catalogue, 1840–41. Accessed September 17, 2025. https://archives.upenn.edu/media/2017/10/catalogue-1840-41.pdf
[7] “Joseph Jones: Confederate Surgeon.” ProQuest Dissertations & Theses. Accessed September 19, 2025. https://www.proquest.com/docview/288310009?fromopenview=true&pq-origsite=gscholar&sourcetype=Dissertations%20&%20Theses
[8] Academic Medicine 85, no. 2 (February 2010). Accessed September 19, 2025. https://journals.lww.com/academicmedicine/abstract/2010/02000/abraham_flexner_of_kentucky
Anatomical Study Prior to the Civil War
- Jonathan Gagnon
In 1857, T.R. Roberts, a medical student at the University of Virginia, pleaded to the governor of his state that “studying anatomy without subjects for demonstration is as fruitless as geometry without diagrams.” The sentiment behind Roberts’ words still rings true today, but his plea also points to a key problem in medical education before the Civil War: the difficulty of acquiring cadavers for anatomical study. While the North had established legal precedent and organized systems for supplying cadavers to medical schools before the Civil War, the South relied on more clandestine and disorderly methods.
In preparing his letter to Governor Henry Wise, perhaps Roberts had heard horror stories from alumni of his university, one of whom might have been A.F.E. Robertson, who in 1834 was “shot in the back by an old fellow” while attempting a grave robbery to acquire a body for anatomical study. The legacy of grave robbing does not stop with Robertson: Dr. Augustus Warner, professor of anatomy at the University of Virginia, devised a scheme to steal the university’s cart and horse in order to perform grave robbing on a larger scale. In this plan, Dr. Warner enlisted one of the university’s janitors to “break open the door to the stable, taking care, whilst so doing to commit the least possible injury to the property.” The clandestine operation was further developed in collaboration with “resurrectionists,” who performed the body snatching themselves. Often, however, the roles of “resurrectionist” and “professor of anatomy” merged, as Medical College of Virginia professor A.E. Peticolas described in a letter: “to continue my lectures I was forced to play resurrectionist myself; by no means a pleasant profession, when the snow is 8 inches deep and the thermometer near zero.” In the absence of legal alternatives, anatomy professors and students in the South formed “pseudo crime rings” to acquire bodies for anatomical study.
These records of grave robbing show that anatomical dissection did take place in the South; however, the difficulties of acquiring bodies likely limited the supply of cadavers available to Southern medical schools. As a result, medical students in the South had less exposure to hands-on dissection and likely learned anatomy in a more theoretical or academic sense. Given the lack of systematic documentation of health outcomes in the Confederacy during the Civil War, it is difficult to assess how this diminished exposure to anatomy affected those outcomes, or how it might have contributed to the outcome of the war itself. Nonetheless, the extensive grave-robbing operations run by students and professors provide a strong contrast to the environment experienced by medical students in the North before the Civil War.
While not entirely immune to the practice of grave robbing, the North had a more cohesive system for delivering bodies to medical schools for anatomical study, established by the Massachusetts Anatomy Act of 1831 and the New York Bone Bill of 1854. The Massachusetts act, titled “An act more effectually to protect the sepulchres of the dead, and to legalize the study of anatomy in certain cases,” created a system that would deliver “unclaimed bodies” to physicians “within twenty-four hours from and after death.” This act, in conjunction with the New York Bone Bill, provided powerhouses of medical education such as Harvard Medical School and the University Medical College of New York University with ample bodies for anatomical study. Beyond advantages in legal backing, Northern medical schools were located in more urban settings, compared to the rural environment of most Southern medical schools. This disparity is highlighted in correspondence between Thomas Jefferson of the University of Virginia and Dr. Philip Physick of the University of Pennsylvania. In an 1824 letter responding to Jefferson’s request for advice on developing an anatomy department, Physick notes that “in our dissecting rooms every facility of dissecting and making preparations is afforded, the supply of subjects on moderate terms, being ample.” Because this correspondence predates any legal system for delivering cadavers to medical schools, it suggests that the North’s dense population centers by themselves provided greater access to cadavers.
Ultimately, the study of anatomy was troublesome for both Northern and Southern medical schools prior to the Civil War. Grave robbing was the predominant means of acquiring bodies for dissection in both regions, but anatomy acts allowed Northern medical schools to develop a more orderly supply. At Southern medical schools, students and professors organized with resurrectionists to acquire bodies or performed the grave robbing themselves. Assessing how these discrepancies in anatomical study between North and South affected the war is difficult given the scarcity of data collection in the South.
References:
1. Anatomical Theatre at the University “Subjects” for Anatomy Class. Virginia.edu. Published 2023. Accessed September 17, 2025. https://exhibits.hsl.virginia.edu/anatomical-theatre/subjects-for-anatomy-class/index.html
2. Report of the Select Committee of the House of Representatives on so much of the governor’s speech, at the June session, 1830, as relates to legalizing the study of anatomy - Digital Collections - National Library of Medicine. Nih.gov. Accessed September 17, 2025. http://resource.nlm.nih.gov/61111250R?_gl=1
3. Breeden JO. Body Snatchers and Anatomy Professors: Medical Education in Nineteenth Century Virginia. The Virginia Magazine of History and Biography. 1975;83(3):321-345. doi:https://doi.org/10.2307/4247966
4. NYU Langone Health History | The Lillian & Clarence de la Chapelle Medical Archives. Nyu.edu. Published 2019. https://archives.med.nyu.edu/about/nyu-langone-health-history
5. Gates E. Theatre of the Macabre. UVA Magazine. Accessed September 19, 2025. https://uvamagazine.org/articles/theatre_of_the_macabre
6. T.R. Roberts to Henry A. Wise, January 12, 1857. Governor Henry A. Wise Executive Papers, 1856-1859, Box 6: Folder 2. Accession #36710, January 14, 1857, The Library of Virginia, Richmond, Virginia.
In 1857, T.R. Roberts, medical student at the University of Virginia, pleaded to the governor of his state that “studying anatomy without subjects for demonstration is as fruitless as geometry without diagrams.” The sentiment behind Roberts’ words continues to ring true today, but they also raise a key issue in medical education prior to the Civil War: difficulties with acquiring cadavers for anatomical study. While the North had established legal precedence and organization for the collection of cadavers for medical education prior to the Civil War, the South relied on more clandestine and disorderly methods.
In the preparation of his letter to Governor Henry Wise, perhaps Roberts had heard horror stories from alums of his university, one of whom might have been A.F.E. Robertson who in 1834 was “shot in the back by an old fellow” while attempting a grave robbery to acquire a body for anatomical study. The legacy of grave robbing does not stop with Robertson, however, as Professor of Anatomy at the University of Virginia, Dr. Augustus Warner, devised a scheme to steal the university’s cart and horse to perform grave robbing at a larger scale. In this plan, Dr. Warner enlisted one of the university’s janitors to “break open the door to the stable, taking care, whilst so doing to commit the least possible injury to the property.” This clandestine operation was further developed in collaboration with “resurrectionists” who performed the body snatching themselves. Many times, however, the line between “resurrectionist” and “professor of anatomy” became one, as described in a letter by Medical College of Virginia Professor A.E. Peticolas: “to continue my lectures I was forced to play resurrectionist myself; by no means a pleasant profession, when the snow is 8 inches deep and the thermometer near zero.” In the absence of legal precedence, anatomy professors and students in the South formed “pseudo crime rings” to acquire bodies for anatomical study.
These records of grave robbing show that anatomical dissection did take place in the South; however, the difficulties of acquiring bodies likely limited the supply of cadavers available to southern medical schools. Medical students in the South therefore had less exposure to hands-on dissection and likely learned anatomy in a more theoretical or academic sense. Given the lack of systematic documentation of health outcomes in the Confederacy during the Civil War, it is difficult to assess what role this diminished exposure to anatomy played in health outcomes, or how it might have contributed to the outcome of the war itself. Nonetheless, the extensive grave-robbing operations run by students and professors provide a strong contrast to the environment experienced by medical students in the North prior to the Civil War.
While not entirely immune to the practice of grave robbing, the North had a more cohesive system for delivering bodies to medical schools for anatomical study, established by the Massachusetts Anatomy Act of 1831 and the New York Bone Bill of 1854. The Massachusetts act, titled "An act more effectually to protect the sepulchres of the dead, and to legalize the study of anatomy in certain cases," created a system that delivered "unclaimed bodies" to physicians "within twenty-four hours from and after death." This act, together with the New York Bone Bill, provided powerhouses of medical education such as Harvard Medical School and the University Medical College of New York University with ample bodies for anatomical study. Beyond these legal advantages, northern medical schools were located in more urban settings, compared with the rural environment of most southern medical schools. This disparity is highlighted in correspondence between Thomas Jefferson of the University of Virginia and Dr. Philip Physick of the University of Pennsylvania. In an 1824 letter responding to Jefferson's request for advice on developing an anatomy department, Physick noted that "in our dissecting rooms every facility of dissecting and making preparations is afforded, the supply of subjects on moderate terms, being ample." Because this correspondence predates any legal framework for delivering cadavers to medical schools, it suggests that the dense population centers of the North already provided greater access to cadavers.
Ultimately, the study of anatomy was troublesome for both northern and southern medical schools prior to the Civil War. Grave robbing was the predominant means of acquiring bodies for dissection in both regions, but legal acts allowed northern medical schools to develop a more orderly system for obtaining them. In southern medical schools, students and professors organized with resurrectionists to acquire bodies and often performed the grave robbing themselves. Assessing how these discrepancies in anatomical training between the North and South affected the war is difficult, given the less systematic data collection in the South.
References:
1. Anatomical Theatre at the University “Subjects” for Anatomy Class. Virginia.edu. Published 2023. Accessed September 17, 2025. https://exhibits.hsl.virginia.edu/anatomical-theatre/subjects-for-anatomy-class/index.html
2. Report of the Select Committee of the House of Representatives on so much of the governor's speech, at the June session, 1830, as relates to legalizing the study of anatomy - Digital Collections - National Library of Medicine. Nih.gov. Accessed September 17, 2025. http://resource.nlm.nih.gov/61111250R
3. Breeden JO. Body Snatchers and Anatomy Professors: Medical Education in Nineteenth Century Virginia. The Virginia Magazine of History and Biography. 1975;83(3):321-345. doi:https://doi.org/10.2307/4247966
4. NYU Langone Health History | The Lillian & Clarence de la Chapelle Medical Archives. Nyu.edu. Published 2019. https://archives.med.nyu.edu/about/nyu-langone-health-history
5. Gates E. Theatre of the Macabre. UVA Magazine. Accessed September 19, 2025. https://uvamagazine.org/articles/theatre_of_the_macabre
6. T.R. Roberts to Henry A. Wise, January 12, 1857. Governor Henry A. Wise Executive Papers, 1856-1859, Box 6: Folder 2. Accession #36710, January 14, 1857, The Library of Virginia, Richmond, Virginia.
Antiseptic and Hygienic Practices During the Civil War
- Jonathan Yu
The Civil War is considered one of the deadliest conflicts in American history (Digital History, n.d.). A significant reason for this was the rapid advancement of military technology, particularly the introduction of the Minié ball, a new type of lead bullet with a shorter reload time, improved accuracy, greater range, and devastating power (HistoryNet, n.d.). Upon impact, the Minié ball shattered bone inside the body, causing catastrophic injuries. This led to a dramatic rise in amputations and, consequently, a surge in infections. Exacerbated by unhygienic practices, disease and illness were responsible for over two-thirds of deaths during the Civil War.
Antiseptic practices during the war were still in their infancy. The vast majority of physicians and the public believed in miasma theory, the belief that inhaling air infected with corrupting matter was the root cause of disease (Halliday, 2001). This belief shaped the design of military camps, where latrines were often placed at the edge of camp to prevent the spread of "bad air." Because the latrines were so far away, however, some soldiers used the walk as an opportunity to desert. Guards were eventually stationed near latrines, but the distance and lack of privacy discouraged their use. As a result, many soldiers relieved themselves near their living quarters, contaminating food and water supplies (Civil War Monitor, n.d.).
Not all military camps were equally unsanitary. Dr. Mütter, a renowned surgeon and professor at Jefferson Medical College, was an early proponent of cleanliness. His student, Colonel Daniel Leasure, commanded a Union infantry regiment during the Civil War and instilled strict hygiene practices among his men (Price, 2020). Leasure's remarkable success in limiting deaths from disease demonstrated the lifesaving importance of sanitation.
Disinfectants also began to see wider use during this period, though not in the way we conventionally use them today. In line with miasma theory, hospitals were kept smelling fresh to prevent the transmission of disease. When fresh air was unavailable, "disinfectants" such as lead nitrate, zinc chloride, charcoal, sulfate of lime, and carbolic acid were used to mask odors (Price, 2019). Despite the rise in disinfectant use, these agents were applied chiefly to improve the smell of the air rather than to disinfect tables and surgical tools.
Antiseptic practices also began to emerge during the Civil War despite the lack of acceptance of germ theory. In 1863, Dr. Goldsmith experimented with bromine as an antiseptic to treat hospital gangrene while serving with the Army of the Cumberland. Goldsmith's application of bromine reduced mortality from 46% to 3% (McIntire, 2023). Additionally, some surgeons began quarantining patients and ensuring that each had their own sponge, towel, and bedding. These small but significant measures foreshadowed later advances in antiseptic technique.
The overall toll of disease on the Union Army was devastating. Pneumonia caused 1.7 million cases and 45,000 deaths; typhoid, 149,000 cases and 35,000 deaths; diarrhea and dysentery, 360,000 cases and 21,000 deaths; and malaria, 1,316,000 cases and 10,000 deaths (Sartin, 1993). These harrowing statistics underscore the central role of disease in the war.
One striking example occurred during General McClellan's Peninsula Campaign in the spring of 1862. McClellan planned to land troops on the Virginia Peninsula and capture Richmond, but diseases such as malaria, typhoid, and dysentery ravaged his army, reducing his strength by more than one-third; there were nearly three disease episodes per soldier over nine months. In the end, McClellan retreated after losing several battles to Robert E. Lee, a setback that likely prolonged the war by one to two years (Sartin, 1993).
American physicians also drew inspiration from Florence Nightingale's reforms during the Crimean War, which emphasized ventilation and cleanliness. In 1861, Richmond, Virginia, saw the construction of the first pavilion-style hospital in the United States. These hospitals, designed to maximize airflow, helped patients recover more quickly and achieved mortality rates as low as 8% (Price, 2018). Their success led to the rapid construction of similarly styled hospitals throughout the war.
The creation of the United States Sanitary Commission in 1861 further advanced military medicine. The commission inspected soldiers and camps, advised on disease prevention, and led relief efforts (Ullman, n.d.). It published a series of essays on camp hygiene and appropriate care for the sick and wounded to guide physicians, many of whom came from small towns and had little experience treating battlefield injuries. These essays provided a much-needed standard of practice during the conflict.
Today, antiseptic and hygienic practices are vastly different. The acceptance of germ theory transformed medicine, allowing physicians to understand how disease spreads and how to prevent it. Surgical instruments are now sterilized between operations, and surgeons wear masks, gloves, and gowns to reduce infection risk. Sewage systems and modern toilets prevent contamination of water and food supplies. Building on the principles of pavilion-style hospitals, modern clinics are designed with private rooms and adequate ventilation. The practices and techniques developed during the Civil War serve as a critical foundation for modern medicine in the United States today.
Works Cited
Digital history. (n.d.). Retrieved September 19, 2025, from https://www.digitalhistory.uh.edu/disp_textbook.cfm?smtid=2&psid=3062
Halliday S. (2001). Death and miasma in Victorian London: an obstinate belief. BMJ (Clinical research ed.), 323(7327), 1469–1471. https://doi.org/10.1136/bmj.323.7327.1469
Minie ball: The civil war bullet that changed history. (n.d.). HistoryNet. Retrieved September 19, 2025, from https://www.historynet.com/minie-ball/
McIntire, T. (2023, July 19). Hospital gangrene in the civil war. National Museum of Civil War Medicine. https://www.civilwarmed.org/hospital-gangrene-in-the-civil-war/
Price, D. (2018, February 20). The innovative design of civil war pavilion hospitals. National Museum of Civil War Medicine. https://www.civilwarmed.org/pavilion-hospitals/
Price, D. (2019, October 1). Fighting disease with smell: “Disinfection” during the civil war. National Museum of Civil War Medicine. https://www.civilwarmed.org/disinfection/
Price, D. (2020, June 29). Germ theory from antiquity to the antebellum period. National Museum of Civil War Medicine. https://www.civilwarmed.org/germ-theory-antebellum/
Reilly R. F. (2016). Medical and surgical care during the American Civil War, 1861-1865. Proceedings (Baylor University. Medical Center), 29(2), 138–142. https://doi.org/10.1080/08998280.2016.11929390
Sartin J. S. (1993). Infectious diseases during the Civil War: the triumph of the "Third Army". Clinical infectious diseases : an official publication of the Infectious Diseases Society of America, 16(4), 580–584. https://doi.org/10.1093/clind/16.4.580
The enemy within. (n.d.). Civil War Monitor. Retrieved September 19, 2025, from https://www.civilwarmonitor.com/article/the-enemy-within/
Ullman, D. (n.d.). The united states sanitary commission. American Battlefield Trust. Retrieved September 19, 2025, from https://www.battlefields.org/learn/articles/united-states-sanitary-commission