Some thoughts on how we might better counter military suicides
By Maj. Karl A. Reuter, U.S. Army
Best Defense guest columnist
Imagine a young PV2 Smith, newly arrived in the military with a bright future ahead, sleeping at 0200 in a barracks room at Fort Anywhere in America. The private has just received devastating news: a Snapchat photo of a boyfriend or girlfriend cheating at a party in their hometown, a thousand miles away. This, combined with the stress of being chastised earlier in the day by squad leader SSG Humptydump for being late to formation, leads the private to post on Facebook, “I feel there is no one to turn to.” No one in the chain of command sees the online post and reaches out, so PV2 Smith spends the next day at work googling ways to self-harm. That night, the private goes to a local club, drinks too much, and gets a DUI on the way home. Weeks later, the soldier commits suicide after receiving a General Officer Memorandum of Reprimand.
I have been familiar with suicide and its selfishness all of my life. Approximately 38 years ago this winter, my uncle killed himself. Growing up, I watched the emotional effects on my grandparents, my father, and his siblings every holiday season, as they spoke and cried over a life never fully lived. Fourteen years later, as a teenager, I watched suicide wreak havoc on my family again when another of my father’s brothers decided life was no longer a solution. This time it was much more personal to me because I was older, but I still didn’t understand it. When I became an adult and joined the military, I began to get a better grip on the struggles individuals face, and saw touches of the “suicide solution” throughout my duty assignments, across all ranks, genders, and races. It always brought me back to that dirty word we did not dare speak of growing up.
It is no secret that the military has struggled in recent years with the dirty word and how to prevent it. Many studies and news articles will tell you that the services, the Army and Marine Corps in particular, have seen an alarming number of suicides among white males between 17 and 24 years of age. Despite knowing the high-risk demographics, the question remains: Why can’t we fix this? As a problem-solving institution with an abundance of technological resources, how can we as a military accept that we have had more deaths by suicide than wartime casualties over the past few years? The answer may lie in the MTV show “Catfish.” The show brings together people who have lied about who they really are while trying to meet or date someone online, using an “avatar” or “digital persona” distinctly different from their real selves. Shows like “Catfish” demonstrate how a digital persona built on social media becomes just as important, if not more important, than the physical domain to the younger generation that appears to be our most at-risk demographic. So the question becomes: How do we take care of, or monitor, our service members’ digital personas when in the past we have trained resiliency almost solely in the physical domain, not the digital one?
I would offer the following to Department of Defense military and civilian leaders interested in a technological, centrally managed “suicide solution” enhanced by decentralized human involvement and execution:
— Stand up a DoD Suicide Prevention Fusion Center that employs analytical tools to look for key words and phrases throughout the force. Businesses use such tools to watch for trends across the Internet, including social media sites, by searching for keywords. We can use the same or similar technology to data mine social media sites, as well as employees’ workstations, to see what they post and search for online. This should be a centralized Suicide Prevention Fusion Center, owned by or fed information from DISA (which runs the DoD networks), that watches for hits on NIPRNet and social media sites and notifies the service component or agency. That component in turn notifies the Emergency Operations Center or senior mission commander, by installation or owning command, of the individual who has googled “asphyxiation,” “how to quietly kill myself,” or whatever key words or phrases are deemed appropriate to focus on. Once notified, the onus is on the commander to get the service member a mental health evaluation and treatment if appropriate. Given the size of DoD and specifically the services, this system needs to be technologically and centrally managed at the DoD level, with decentralized execution at the lowest level possible by people who know the affected service member.
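At its core, the keyword-watch step described above is simple pattern matching followed by a notification. The sketch below is purely illustrative; the watchlist phrases, function names, and routing logic are assumptions invented for this example, not any existing DoD system.

```python
# Illustrative watchlist; a real fusion center would curate and update
# these phrases with clinical and operational guidance.
WATCHLIST = [
    "no one to turn to",
    "how to quietly kill myself",
    "asphyxiation",
]

def flag_text(text, watchlist=WATCHLIST):
    """Return the watchlist phrases found in a post or search query."""
    lowered = text.lower()
    return [phrase for phrase in watchlist if phrase in lowered]

def route_alert(user_id, hits):
    """Hypothetical stub: tell the owning command about a flagged hit."""
    if hits:
        return {"user": user_id, "hits": hits, "action": "notify_command"}
    return {"user": user_id, "hits": [], "action": "none"}
```

For example, PV2 Smith’s post would produce a hit on “no one to turn to,” and `route_alert` would return a `notify_command` action for the owning command to act on.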
— Utilize the newly formed Defense Innovation Unit Experimental (DIUx) to collaborate with and adopt the recent practices of Facebook, Instagram, and Twitter, all of which now have programs to address users whose digital persona shows signs of depression or suicidal tendencies online. Social media sites have a moral obligation to collaborate with DMDC, which manages the largest store of information on DoD employees and which could in turn inform the Suicide Prevention Fusion Center of a concerning post by an identified DoD employee or service member. Once notified, the fusion center can triage the situation by alerting the proper leadership or command team and/or by providing a popup on the individual’s computer screen with resources for help. This is another mitigating step, in the hope that the affected service member will reach out for treatment, given the proper tools he or she may or may not have already received through resiliency training.
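The triage step in this bullet amounts to an identity match followed by a severity-based routing decision. The following sketch is hypothetical: the roster, handle, and severity tiers are stand-ins for a DMDC-style identity lookup and whatever escalation rules a real fusion center would adopt.

```python
# Hypothetical roster mapping social media handles to service members,
# standing in for a DMDC identity lookup.
ROSTER = {
    "@pv2smith": {"name": "PV2 Smith", "command": "Fort Anywhere EOC"},
}

def triage_flag(handle, severity, roster=ROSTER):
    """Route a flagged post: notify the command team for high-severity
    hits, otherwise offer on-screen resources to the individual."""
    member = roster.get(handle)
    if member is None:
        # Not a DoD-affiliated account; nothing to route.
        return {"action": "no_match"}
    if severity == "high":
        return {"action": "notify_command", "command": member["command"]}
    return {"action": "screen_popup", "member": member["name"]}
```

A high-severity flag on a matched account routes to the owning command team, while lower-severity hits fall back to the popup with help resources described above.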
If we implement the technological solutions above, we will catch the initial despairing Facebook post from PV2 Smith, through collaboration with social media sites or through our own analytical tools. In addition, if the affected individual is at work the next day googling how to inflict self-harm, we would be able to provide the proper resources for help or treatment that, short of human interaction, we would not otherwise know the individual needed. We have the technological capabilities to stop the inevitable DUI, Article 15, or next trigger event or cry for help from happening, before the domino effect becomes unstoppable and we are left with a ruined military career as well as the loss of a life not lived. I offer these possibilities as someone who hopes we can mitigate our current suicide problem throughout the military by employing readily available technology against a problem we still don’t fully understand, and potentially keep us from hearing that dirty word again and again.
The author is a military police officer currently assigned as an Army staff officer at the Pentagon. His opinions are his alone and do not reflect those of the United States Army or the Department of Defense.
Image credit: Edouard Manet/Le Suicidé/The Yorck Project: 10.000 Meisterwerke der Malerei/Wikimedia Commons