by James Porter
If we examine the history of stress management through the lens of America’s major wars, we discover a painful evolution—from misunderstanding and punishment to recognition and, eventually, real compassion. Nowhere is this shift more evident than in the treatment of trauma on the battlefield.
The Civil War: Misunderstood and Punished
During the Civil War, soldiers who experienced what we would now call traumatic stress were labeled as cowards. Back then, the prevailing belief was that any soldier who refused to fight, ran away, or exhibited signs of psychological collapse simply lacked moral fiber. These men were sometimes accused of desertion or even treason. In extreme cases, they were hanged or shot by firing squad as an example to others. There was no language to describe psychological trauma in medical terms, let alone compassion for it.
World War I: Shell Shock Emerges
By World War I, the scope and scale of combat had changed dramatically. The introduction of trench warfare and relentless artillery barrages gave rise to a new term: “shell shock.” The thinking at the time was that soldiers exposed to constant explosions were physically rattled—literally shaken to the core—by the noise and vibrations. While this explanation at least acknowledged that something real was happening to these men, it was still rooted in a physical understanding rather than a psychological one.
Still, “shell shock” became an accepted term to describe soldiers who couldn’t return to the front lines, who trembled uncontrollably, who experienced confusion, panic, or paralysis. Treatment remained rudimentary, and many soldiers were sent back into battle without adequate rest or recovery.
World War II: Battle Fatigue Takes Hold
By World War II, medical professionals began to realize that trauma didn’t just affect those exposed to artillery fire. The same symptoms were showing up across all branches of the military—including among soldiers who had never been near a barrage. The term “battle fatigue” replaced shell shock, reflecting a broader understanding that something psychological was happening.
“Battle fatigue” acknowledged the cumulative toll of combat: the sleepless nights, the constant danger, the moral injuries of killing or witnessing death. Treatment improved slightly—more rest, some reassignment—but stigma remained. Suffering from battle fatigue was still seen as a sign of weakness, and soldiers often hid their symptoms for fear of being sent home in disgrace.
The Vietnam War: The DSM and PTSD
It wasn’t until after the Vietnam War that psychological trauma began to be clinically recognized. In 1980, the American Psychiatric Association added Post-Traumatic Stress Disorder (PTSD) to the third edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-III), which finally provided a formal medical framework for understanding war-related trauma.
During Vietnam, veterans suffering from symptoms identical to those in earlier wars were still often stigmatized, but the groundwork for change had been laid. By the late 20th century, PTSD had become a legitimate diagnosis, though many veterans still struggled in silence.
The Turning Point: Suicide Awareness
The real sea change came with the tragic increase in veteran suicides. As suicide rates among returning soldiers and veterans climbed, the military—and society at large—could no longer ignore the consequences of untreated trauma. PTSD was no longer a niche psychiatric diagnosis; it was a matter of life and death.
That grim reality helped shift public perception. Rather than viewing trauma as a moral failing or a weakness, we began to see it for what it truly is: an injury—real, serious, and deserving of care.
Only in recent years has the stigma around PTSD begun to lift. But it took more than a century—and countless lives—for that understanding to take hold.