Threat Assessments and ‘Leakage’:
What Your Teams Don’t Know Can Hurt You
Written by Chelsea Payne, Threat Analyst for Safer Schools Together
Those who work in the field of Threat Assessment know how important it is to identify leakage. According to “The Concept of Leakage in Threat Assessment” by J. Reid Meloy and Mary Ellen O’Toole, the definition of leakage is as follows:
Leakage in the context of threat assessment is the communication to a third party of an intent to do harm to a target. Third parties are usually other people, but the means of communication vary, and include letters, diaries, journals, blogs, videos on the internet, emails, voice mails, and other social media forms of transmission. Leakage is a type of warning behavior that typically infers a preoccupation with the target, and may signal the research, planning, and implementation of an attack.
While many may be taught to look for traditional forms of leakage, Digital Threat Assessment® teaches participants how to find and assess online and digital leakage. As a Threat Analyst for Safer Schools Together (SST), I am trained to pick up on worrisome data on social media that may otherwise get overlooked. To better understand what leakage is and how it can impact school safety, here is a real-world example I worked through recently.
While conducting scans for a school district, I came across an Instagram Story from a student that caught my attention. It was long, about 60 seconds (not the usual 15-second Story you typically see). To the average viewer, it may have appeared to be a poorly shot video of a boy riding his bike across the train tracks. But it wasn't the video that was worrisome; the audio was another story.
As I turned up the volume and listened to the boy on the bike over and over, I heard him saying to himself in a serious tone, "This is me when I want to die, why do I want to die? I hate myself." As the video continued, his tone shifted. Now he was pleading, "I can't, I can't, I can't. I'm sorry, I don't want to scare you guys."
Knowing the severity of the situation, I found the student's TikTok account, where I was confronted with even more worrisome content aligning with the original Story posted on Instagram. The posts indicated that the student had visited the train tracks three different times: the first visit produced a video of the train going by with no commentary; the second was the first instance of the student approaching the train and discussing his intention to take his own life; the third was similar to the second, with the student putting himself down and pleading. The student also had self-deprecating content on his Instagram and TikTok accounts. He was calling himself ugly and saying things like, "You hate me, don't you? It's okay, don't worry because I hate me too."
Evaluation of the Case
While compiling my Worrisome Online Behavior (WOB) Report, I assessed that there had been a shift in the student's baseline within the month. I also noted that the self-deprecating posts on his Instagram and TikTok appeared to be baseline behavior for the student. The three separate visits to the train tracks showed the means by which he intended to take his life and that he was preparing himself for it, what we recognize as rehearsal behavior. The frequency of the student's posts had increased over the past four weeks, and the intensity and recency of those posts also contributed to the severity of this case.
After the report had been sent, SST was notified that the school had made contact with the boy. It was soon revealed that he had planned to take his life by standing in front of the train within a matter of days. This was truly a life-or-death situation. Because of the WOB Report, the school opened a Threat Assessment case on the student to create a safety plan and coordinate critical supportive services. Supports included connection with a mental health counselor and outreach support for the family.
Advice from a Threat Analyst
It is human nature to pick up our phones, jump on social media, and unconsciously scroll past trends, funny posts, sad posts, and posts from family and friends. As Threat Analysts, our scrolling is anything but unconscious: we look at every detail, listening, observing, and analyzing every post. This upcoming year, we can work together to become more attentive, to scroll consciously through our social media feeds, and to be willing to act when we see something even slightly worrisome, or a cry for help.
According to SST Threat Analyst Luke Schwarm, “Worrisome content is missed very often. In terms of the content itself, I think a lot of the things we find would be missed by most people and software. Digital Threat Assessment® training teaches you to look for certain behaviors, slang, aesthetics, trends, etc. that are commonly associated with the content.” He adds, “I think it is important to know what kind of things you are looking for and to not judge or get too involved in the actual content.”
Early identification is the goal of Digital Threat Assessment® training. Only after worrisome behavior is identified can School Safety and Threat Assessment teams provide subjects of concern with early intervention measures and a support system that puts them on a pathway to success.
Learn more about Digital Threat Assessment® training from the International Center for Digital Threat Assessment (ICDTA) and Safer Schools Together – discover how it can help your school threat assessment teams gather digital data to inform your next steps in a behavioral threat assessment and put immediate risk-reducing interventions in place.