
If there's a heaven, there must be a special place in it for those who have lived through hell and who manage to salvage something from their own purgatorial nightmare to help others.
Consider, for example, Jean Baker. Spurred on by the suicide of her 21-year-old son, Brian Case, in 1999, Baker, who lives in Houston, became a teacher for special needs children. She has also written a book, Schizophrenia: Evolving from My Son's Suicide to the Classroom, and works as an outspoken advocate for kids with mental illness. Still, she's always looking for more ways to translate her son's tormented life and death into something of permanence. "There had not been any accounting of his passing," she says. "They didn't even put it in the newspaper in the town where he died."
Last spring, she found another way to make Brian's death count for something. Tuning in to Talk of the Nation on National Public Radio, she caught an interview with John Pestian, a clinical scientist at Cincinnati Children's Hospital Medical Center who is leading a project that is attempting to use computer technology to identify people who are suicidal. Pestian's area of expertise is computational medicine, that is, applying technology to help physicians and clinicians make decisions about treatment. The work that he discussed on NPR was focused on using information gleaned from suicide notes.
Baker had few artifacts from the last months of her son's life, when schizophrenia had rendered his world chaotic. But she did have the waterlogged message that police found in Brian's pocket when his body was recovered from the Texas lake where he died. It began painfully: "Maybe my dying is for the best," he wrote. It then devolved into the confused rambling characteristic of his illness before concluding with two clear and desperate questions: "Where does it end? How does it end?" He answered by drowning himself.
Pestian's project was so compelling to Baker that, she says, "I tracked him down." Now her son's last words are part of an extraordinary collection of data: the language of approximately 1,400 people who have ended their own lives.
A computer databank may sound like a soulless repository for something so personal. But Baker sees it as a place where important remnants of Brian's life, his words, have a place. Working with a research team at CCHMC, Pestian is looking for clues in language that can help reveal when a person is bound for self-destruction. In the complex, confounding mystery that is suicide, an early detection system like that could be revelatory. "This means the world to me," says Baker.
More people in the United States now commit suicide than die in car accidents: about one every 14 minutes. In Hamilton County in 2012, there were more suicides than homicides: almost one every three days.
If you spotted a drunk driver or a guy in a ski mask with a shotgun heading into a 7-Eleven, you'd call for help. But short of seeing someone clamber over a bridge railing, how can you tell when a person is liable to harm himself? How can you tell when another human has moved beyond ordinary despair or sadness or frustration or confusion, and in defiance of his body's biological imperative to survive, is at risk of ending his life? That's the puzzle that has driven John Pestian for the past eight years. Pestian is director of the hospital's Computational Medicine Center, a collaboration with the University of Cincinnati and Ohio's Third Frontier (the state's technology-based economic development initiative) that looks for ways to use technology to improve clinical practice.
"We don't make decisions," Pestian explains. "We bring together data so that the clinician can make a better decision." But on the computer in his small office there's a prototype of a tool that may, in the not-too-distant future, be in the hands of mental health workers. And it seems like science fiction.
On the screen is a Sim character that may very well be the most intuitive therapist this side of Good Will Hunting. She's slim and young and, in this iteration, white. But she could be older, or African-American, or a man dressed in camo speaking with an Appalachian accent. "There's a whole lot of research about how to change the virtual human for who you're dealing with," Pestian says. "It could be that kids want someone to look like a nurse, or [soldiers with PTSD] want it to look like a soldier."
The goal of the Sim prototype isn't really to create the ideal mental health worker; it is to create one that a patient will talk to. Because it's a patient's response (the turns of phrase, the vocal inflection, each time shoulders droop or nostrils flare) that Pestian and his team are mining for information, and that may someday help a nurse or doctor or therapist make a decision: Is this person a suicide risk?
Pestian says that the research started with a colleague's frustration. "A friend, a psychiatrist, came into my office," he recalls. "He said, 'We just don't know what to do. We have 40 kids a week coming into our emergency room who are suicidal, and it's just such a challenge to analyze and decide what to do with them.'" Some would be admitted for psychiatric care; others would be assessed, and because they didn't seem apt to harm themselves, sent home. But a good number of the ostensibly low-risk kids would be back in the ER before long.
"That was the genesis of it," Pestian says. "He said, 'Can you help?'"
Pestian's idea was to use linguistics-based data to see if it was possible to teach a computer to distinguish between someone who is genuinely driven to take his own life and someone who is not. And the one way to get the indisputable words of the former was clear: "We said to ourselves, 'OK, let's get some suicide notes, and see what we can do with that.'"
He turned to Edwin Shneidman and his colleagues at the University of California, Los Angeles. Shneidman, who died in 2009 at the age of 91, was a pioneering psychologist. Among other things, he introduced the idea of suicide prevention. "He's the father of suicide research," Pestian says, "a wonderful guy who, at 90, would go around in his nightshirt." Shneidman's work not only launched a thousand crisis hotlines, it shaped much of our current thinking about self-destruction. And his archives included an unusual collection: scores of suicide notes he had amassed over the course of his career, some dating back to the 1950s. As luck would have it, he also had a group of fake notes. Back in the 1970s, as part of his own research, Shneidman asked a group of men at a union hall, "If you were going to commit suicide, what would you write?"
The union hall experiment was, by contemporary research standards, ethically ambiguous at best. "You couldn't do that today," Pestian says. But the notes turned out to be important. "We took the real notes and the pseudo-notes and we said, 'We'll see if we can tell the difference.'"
That meant creating software for sentiment analysis: a computer program that scrutinized the words and phrases of half the real suicide notes and learned how to recognize the emotion-laden language. They tested it by asking the computer to pick out the remaining real notes from the simulated ones. Then they had 40 mental health professionals (psychiatrists, social workers, psychologists) do the same. According to Pestian, the professionals were right about half the time; the computer was correct in 80 percent of the cases.
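The article doesn't name the algorithm behind that experiment, but the setup it describes (learn from half the genuine notes, then separate the held-out genuine notes from the simulated ones) maps onto a standard text-classification workflow. Here is a minimal sketch of that workflow, assuming a TF-IDF bag-of-words representation and logistic regression in scikit-learn; the simulated-note texts and the model choice are illustrative assumptions, not the team's actual pipeline.

```python
# A minimal sketch of a real-vs-simulated note classifier, assuming TF-IDF
# features and logistic regression. This is NOT the project's actual method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# "Real" snippets below are lines quoted in the article; the "simulated" ones
# are invented placeholders standing in for Shneidman's pseudo-notes.
real_notes = [
    "I do not see myself getting well ever again.",
    "Life is too great a struggle. Please forgive what I am doing.",
    "My car is at the service station. This check is to cover cost of burial.",
    "I love you. I love you. I love you.",
]
fake_notes = [
    "I suppose I would say goodbye and tell everyone not to worry about me.",
    "Take care of the house and make sure the bills get paid on time.",
    "I guess I would write that things just got to be too much for me.",
    "Tell the kids I love them and that none of this is their fault.",
]

texts = real_notes + fake_notes
labels = [1] * len(real_notes) + [0] * len(fake_notes)  # 1 = genuine, 0 = simulated

# Mirror the article's setup: learn from half the notes, test on the rest.
X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.5, stratify=labels, random_state=0
)

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
classifier = LogisticRegression(max_iter=1000)
classifier.fit(vectorizer.fit_transform(X_train), y_train)

predictions = classifier.predict(vectorizer.transform(X_test))
print(f"Accuracy on held-out notes: {accuracy_score(y_test, predictions):.2f}")
```

With a corpus this tiny the number means nothing; the point is only the shape of the train-then-test comparison the researchers ran against the 40 clinicians.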
"So we said, 'OK, we can figure this out,'" he recalls. "If the computer is taught how to listen, it will be able to listen to this database and say, 'This sounds like it's suicidal.' Because there are patterns in the language that are the language of suicide." Even if those patterns are not always apparent to a trained professional, the real-note/fake-note test held out the promise that a computer could learn to spot them.
"They're there," Pestian says.
Big research calls for big data, and Pestian has amassed it. Shneidman's collection has been expanded with notes from all over, sent to Pestian by professionals who've read about his research in medical journals and folks like Jean Baker who heard about it via news stories. He even received 200 from an agency in Poland ("And I don't speak Polish," he says).
As notes arrived they were scanned, uploaded, and transcribed, with names, dates, and anything else that might identify the writer pulled out. Then his team built machine-learning software (what most people know as "artificial intelligence") to look for patterns in suicidal language.
On a recent morning, braced by coffee from the hospital cafeteria, Pestian shows me how it works. Choosing a numbered document from the list on his screen, he opens it: I hope you will look after Mom, instructs the writer. I'm sorry I have taken such a cowardly way out. I seem to be a beautiful misfit and I always will be.
"Aw," he says. "Beautiful misfit."
Then he begins to annotate what has been written, highlighting phrases (Please look after Mom; I'm sorry; beautiful misfit) and logging each one in a classification that appears in a pull-down menu. Is the writer feeling sorrow? Giving instructions to a family member? Those two are easily classified. But what about the beautiful misfit remark? Should the computer learn to recognize that as sorrow? Pride? Hopelessness? In an actual annotation process, three readers would go through each note, and their responses would be weighted.
That's one of the things that separates this research from ordinary number-crunching. Pestian needed people, a lot of them, to read the notes and assign emotions to the words so that they could be digitally catalogued like shoe styles on Zappos.com. Over time, the computer would learn through the growing corpus of data that certain words and phrases signify "hopelessness," for instance, or that a writer who says "Be sure to pay the gas bill" and another who instructs "Please look after Mom" have the same mindset and share an emotional state with many others who have killed themselves.
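To make the weighting concrete, the sketch below shows one simple way three readers' labels for a single phrase might be combined into one catalogued emotion. The label set, the weighting scheme, and the example classification are assumptions for illustration; the article doesn't describe how the team actually reconciled its annotators.

```python
# A minimal sketch of combining three readers' emotion labels for one phrase.
# Label names and the weighting scheme are illustrative assumptions only.
from collections import Counter

def aggregate_labels(labels, weights=None):
    """Return the winning emotion label and its share of the total weight.

    labels:  one label per reader, e.g. ["sorrow", "hopelessness", "hopelessness"]
    weights: optional per-reader weights (defaults to equal weighting)
    """
    if weights is None:
        weights = [1.0] * len(labels)
    totals = Counter()
    for label, weight in zip(labels, weights):
        totals[label] += weight
    winner, score = totals.most_common(1)[0]
    return winner, score / sum(weights)

# Three hypothetical readers classify the "beautiful misfit" phrase:
print(aggregate_labels(["sorrow", "hopelessness", "hopelessness"]))
# -> ('hopelessness', 0.666...)
```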
He had an idea where he might find willing hands for the work. "The literature will tell you that, on average, six people are profoundly affected by one suicide," he says. "So we called on those folks." Pestian enlisted people who'd lost a loved one (165 volunteers) to help get through the initial body of material. Subsequent notes, like the three he received the week before our meeting, are continuously added. (And yes, he could use someone who knows Polish.)
The project took years to develop, not just because creating the software was challenging, but because "it's a topic that's a challenge to get used to.
"I've read every one of those notes," Pestian adds. "If you let it, it'll eat at you. So you have to put [them] down for a while."
If you think that reading 1,400 suicide notes would be disheartening, you'd be right. Scan just a few and you can't help but feel a rising tide of hopelessness, not to mention sorrow, guilt, and affection.
I do not see myself getting well ever again.
Life is too great a struggle.
Please forgive what I am doing.
I love you. I love you. I love you.
Sometimes there's anger, too, as well as regret and recrimination. And often there are instructions to survivors about taking care of a decedent's affairs. My car is at the service station or This check is to cover cost of burial: housekeeping details from beyond the grave.
But if you think that reading 1,400 suicide notes would be a profoundly illuminating journey into the human psyche, you would be wrong.
"Some of the people who annotated the notes were disappointed that they didn't reveal more," says Michelle Linn-Gust, past president of the American Association of Suicidology, who worked with Pestian to recruit volunteers for the project. "People think these notes are going to be confessional. Most are not. It's 'My jewelry goes to so-and-so.' They've already checked out."
Still, Linn-Gust's volunteers were dedicated. The grimness of the task caused few to drop out of the effort, she says, and some even asked for more work. Researchers haven't often enlisted survivors, people who have lost someone to suicide, in this kind of project. "I knew we wouldn't have a problem getting people to do it," she says. "We want to help."
Linn-Gust, who lost her own sister when she was in college, has a PhD in family studies; she writes and speaks about coping and bereavement. She wasn't surprised that the material in the study was not, on the surface, endlessly revealing. "My sister's note said, 'I know you love me; I just don't love myself,'" she says. "It didn't tell us anything we didn't already know."
What is surprising to her is the ingenious way Pestian's team is using the notes. There has been a great deal of research on suicide risk factors such as depression, substance abuse, and family history. And today there are more ways to address these issues (therapy, family intervention, medication) than there were back in the 1950s, when Edwin Shneidman had to convince his contemporaries that someone with suicidal thoughts was not inherently insane. But, Linn-Gust says, "the field has reached a plateau. He's working in a totally new direction."
And Cincinnati, it seems, is an apt place to do it. Not because we have more of such deaths than usual (Hamilton County's suicide rate of 10.63 per 100,000 is slightly below the state average), but because CCHMC has an Emergency Department that is tied in to the issues. Jacqueline Grupp-Phelan, the hospital's director of research for the division of emergency medicine, studied children who came into the ER with suicidal ideation, that is, kids who admitted they were "having thoughts" about hurting themselves. In any given year, anywhere from a third to half of the patients she tracked concerned the medical staff enough to be admitted. Of those who were evaluated and sent home, 20 percent were back in the ER within two months. "A very high risk group of kids," Grupp-Phelan says. Though initially they didn't seem that way, even to trained professionals. And that was just a survey of kids who admitted to having dark thoughts. Boys in crisis aren't usually forthcoming, she says. "They just kill themselves."
Whether it's being applied to adults or children, Grupp-Phelan says, what Pestian and his team are doing is trying to create "another way of getting at risk." They're trying to deliver insights that get past a child's reluctance to talk; that won't be compromised by an adult's bravado; that can't be distorted by a staff member's poor night's sleep the evening before.
In layman's terms, the goal is for a clinician to be able to take a person in crisis and read him like a book, even when they have to read between the lines.
Remember the Sim on Pestian's computer? To me, the virtual human looks a whole lot like Lesley Rohlfs, the project coordinator for his lab. Maybe that's because she's sitting squarely in front of me, just like the Sim on the screen. The major difference being that she is very human and I'm looking at her framed by two slender silver cameras, one facing me, one facing her, perched on tripods.
Rohlfs has a degree in counseling and human behavior and a background in pediatric psychiatry; for several years she was on the staff of CCHMC's inpatient facility in College Hill. "I worked with what we always called 'the babies,'" she says, "patients as young as 2 who were admitted for all types of mental health issues. Things like depression, aggressive behavior, and, unfortunately, suicidal behavior.
"The burnout rate working directly with patients is very high," she says by way of explaining how she came to apply for a position on a team where her colleagues would be database developers and software gurus. Going into the interview, she didn't know much about the research. "When I found out, I had to have this job!" she says.
Several times a week, she takes her cameras to record interviews with a variety of patients who show up in the emergency room at CCHMC. She begins with a couple of standard mental health questionnaires. But what she tapes are the responses to five pretty simple queries at the end of the session:
Do you have hope?
Do you have any fears?
Do you have secrets?
Are you angry?
Does it hurt emotionally?
"We call them ubiquitous questions," she says. "They're not specific to suicide. You can ask them of someone who comes in with a broken arm or a sore throat." The cameras collect the words she and the patients use, the pauses, the timbre of their voices, their body movement, the way emotions flicker across their faces. The recordings of Rohlfs will teach the Sim to be more lifelike; the recordings of patients will be used to identify "thought markers": to distinguish the sentences, silences, gestures, and expressions of a suicidal state from those of a non-suicidal one.
Rohlfs's CCHMC interviews will join those being done at three other locations: University of Cincinnati Medical Center (she'll begin adult interviews there this winter), a hospital in West Virginia, and one in Canada. Ultimately, the Thought Marker project will contain data from 500 people, "the largest study ever for something like this," Pestian says. And eventually the findings will be used in concert with data from another investigation: an exploration of the relationship between a suicidal person's emotional state and their genetic traits.
It's a far sight more complicated than teaching a computer to tell the difference between a real suicide note and a phony, but Pestian thinks the principles are the same. "We know we can tell from language who's suicidal and who's not," he says. "When we combine video with audio, with language and genetics, does the suicidal person look differently than the non-suicidal person? Can we create a multi-modal test that looks at a person's state and at a person's traits and identifies suicidal people? That's why this is so powerful."
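As a rough illustration of what a multi-modal test could look like in software, the sketch below simply concatenates per-modality feature vectors (language, audio, video, genetics) and feeds them to a single classifier. Everything here, from the toy numbers to the choice of model, is an invented placeholder; the project's actual pipeline isn't described in the article.

```python
# A rough sketch of a multi-modal classifier: per-modality feature vectors are
# concatenated into one "state and traits" vector for a single model.
# All numbers, feature sizes, and the model choice are invented placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

def combine_modalities(language, audio, video, genetics):
    """Stack per-modality feature vectors into one input vector."""
    return np.concatenate([language, audio, video, genetics])

# Two hypothetical interviews (1 = judged at risk, 0 = not):
x_at_risk = combine_modalities(
    np.array([0.8, 0.1]), np.array([0.3]), np.array([0.5, 0.2]), np.array([1.0])
)
x_not_at_risk = combine_modalities(
    np.array([0.1, 0.9]), np.array([0.7]), np.array([0.1, 0.6]), np.array([0.0])
)

X = np.vstack([x_at_risk, x_not_at_risk])
y = [1, 0]

model = LogisticRegression().fit(X, y)
print(model.predict_proba(X))  # class probabilities for the two toy examples
```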
Interacting with a Sim on an iPad programmed to calculate the odds of your self-annihilation does sound pretty Minority Report-ish. But Pestian and Rohlfs emphasize that it will be just one more way to help a doctor or social worker as they assist a patient: a collection of data that can be part of their decision-making. It will be a tool, and Rohlfs imagines it being used in a busy waiting room, while a patient is waiting for a face-to-face evaluation. "They'd be able to take that data in real time," she says, data that offers a human caregiver an opinion about whether this man or woman, child or teen, is likely to harm themselves at this point in time and whether they need to be protected from themselves.
Of course, having a tool that says this person is suicidal still doesn't answer the most fundamental question: Why? The solution to that puzzle lies outside the scope of Pestian's research, but it was never far from his mind as he went through the 1,400 notes.
"Have you ever been with someone who is dying?" he says. "We have a natural will to live; we kick ass to live. And these folks chose death. Why?"
He has his theories: about genetic variance, about biochemical changes and emotional stressors. But for now he's teaching computers how to listen to voices from the past and the present, hoping that the work makes a difference in the future.
Originally published in the January 2014 issue.