Mother Jones’ Bad Data on Mental Illness and Gun Violence

In the wake of a horrific series of mass shootings in the United States in 2012, a robust debate has begun about the public policy changes clearly needed to prevent such tragedies, though the debate has thus far sidestepped many of the foundational cultural issues at work. People have been quick to blame the problem on guns, suggesting that eliminating guns will eliminate rampage violence, while others have targeted mental illness, using similarly eliminationist rhetoric.

Few have asked why these shootings typically involve classically disaffected white men of a certain age, men who have perhaps grown up with a sense of entitlement and ownership. Few are probing the cultural attitudes that encourage men to think of the world as their own personal property, and fewer still seem to be asking why so many of the victims in these cases are women.

Mother Jones has been at the front of the pack, and it had a lead right out of the gate: in July, the venerable magazine published the results of a detailed investigation into mass shootings in the US since 1982. The report was continuously updated to account for new data, which primed Mother Jones to be ready to go on 14 December, when news of the Sandy Hook shooting broke.

Almost immediately, people turned to the Mother Jones reporting and distributed it widely: it was readily available, it came from a highly reputable source, and it fit neatly with arguments frequently made in liberal communities about the origins of gun violence. Mother Jones leaned heavily on the mental illness angle in particular, and many people repeated the magazine’s ‘finding’ that 38 of 62 mass shooters had displayed ‘signs of mental health problems’ before they killed.

Many people jumped to conclusions on the basis of this information: two-thirds of mass killers are mentally ill, ergo mentally ill people should be more closely monitored, regulated, and tracked. Perhaps there should even be a registry, and more rigorous checks to keep guns and other weapons out of the hands of mentally ill people. This despite the fact that mentally ill people face a higher risk of experiencing violence than the general population, and the fact that they are primarily dangers to themselves, not others.
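
A quick base-rate calculation shows just how weak that ‘ergo’ is. In the sketch below, only the 38-of-62 figure comes from the Mother Jones data set; the population base rate and the other numbers are hypothetical placeholders chosen purely to illustrate the arithmetic.

```python
# Hypothetical illustration only: apart from the 38/62 ratio, every number
# here is a placeholder, not a measured statistic.
p_signs_given_shooter = 38 / 62    # the Mother Jones headline ratio
p_signs_in_population = 0.20       # placeholder: assumed base rate of 'signs'
n_population = 300_000_000         # rough US population, for scale
n_shooters = 62                    # shooters in the data set, 1982-2012

p_shooter = n_shooters / n_population

# Bayes' theorem: P(shooter | signs) = P(signs | shooter) * P(shooter) / P(signs)
p_shooter_given_signs = p_signs_given_shooter * p_shooter / p_signs_in_population

print(f"{p_shooter_given_signs:.9f}")  # well under one in a million
```

Whatever base rate you plug in, the conditional probability that any given person with ‘signs of mental health problems’ will become a mass shooter stays vanishingly small, which is why monitoring and registries aimed at mentally ill people make no statistical sense.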

On the basis of one article, members of the mentally ill community had been tried and found wanting.

A series of fatal flaws occurred here. The first was a failure on the part of Mother Jones to disclose detailed information about the source of the data, let alone to release the raw data set for everyone to analyse independently. Without this information, it was difficult to make an informed judgment about the validity of the conclusions reached in the article, which was presented, like other investigative journalism, as a detailed study and analysis of a social phenomenon. It was also presented in a way that invited readers to view it as authoritative; it fairly glittered with graphs, percentages, and statistics.

The second was a failure of reading. People assumed that Mother Jones was a reliable source, especially since the information confirmed their expectation that mental illness lies at the root of violence, and so they happily absorbed and distributed it. They didn’t think critically about the data: where the information came from, how it was compiled, who analysed it, and how it was analysed. Evidently, readers thought that data sets magically present themselves and are always readily understandable.

Along with many others, I called for a release of the raw data because I had a number of questions about it. I wanted to know where it came from: medical records? After-the-fact observations? News coverage? Police reports? These sources all mean very different things.

Medical records, for example, offer a detailed picture of a patient over time, complete with contemporaneous observations linked to each medical visit, and in some cases they may also include psychiatric records. After-the-fact observations and news reports, on the other hand, are very weak sources of information. Subject to confirmation bias, they include statements from people who are not psychiatric professionals, who are not qualified to determine whether someone showed signs of mental illness, and who, above all, are primed to claim that someone acted or seemed mentally ill, because this is the expected answer. Police reports, too, are subject to similar biases.

Furthermore, Mother Jones hadn’t explained how it was defining ‘signs of mental health problems’ or who was evaluating the data to determine whether people met the criteria. This is a classic failure of reporting; without this information, the claim being made is useless. In an academic or scientific journal, the methodology would have to be presented at the outset, along with the source of the data set or sets. In Mother Jones, none of this information was made available, making it impossible to evaluate the information provided. Were the conclusions in the article accurate? Who knew, because everything about the methodology was in a black box.

At the end of December, Mother Jones finally released the full data set, and what I found was disturbing: the ‘mental health’ column was filled with after-the-fact observations, hearsay, and speculation, which was exactly what I’d feared when the publication released the initial article based on these data. Calling them ‘data’ is a bit of a stretch; they are indeed a compilation of words in boxes, but they vary widely in usefulness and validity. Some information is very easy to verify factually: gun make and model, for example, is readily extractable from police reports and archived press conferences.

But statements like ‘neighbors said he suffered from depression and had a drinking problem’ and ‘his brother called him “unbalanced” and mentally ill’ do not provide evidence of mental health problems. They provide evidence that people like talking to the media in the wake of tragedies, and that they will say precisely what they think the media want to hear in order to see their names in print and enjoy some time on camera.

Contrast these hearsay statements with more concrete evidence in the same column: entries like ‘he voluntarily visited a psychiatric ward. He was hospitalized at least once for suicidal tendencies and was taking Prozac’ and ‘a psychiatrist, testifying for the prosecution, said he suffered from schizophrenia’ indicate a concrete history of mental health problems.

That means the ‘data’ compiled to prove evidence of mental illness were highly mixed in nature. A hearsay statement is not equivalent to hard evidence like medical records or a psychiatric evaluation; there’s a reason one type of evidence is often not allowed in court and the other is. Mother Jones cannot pretend that these kinds of data are interchangeable, because they’re not; and in any case, even the strongest entries can’t prove a firm causative relationship between a history of mental health problems and rampage violence. Correlation is not causation.

If you whittle the raw data down to cases where killers demonstrably had signs of mental health problems, which I’m defining as cases where medical records testified to mental illness, or where a psychiatrist or other mental health professional testified after the fact using available information about the patient, that figure of 38 drops considerably. Yet Mother Jones chose not to apply a more conservative evaluation of the data set, because that wouldn’t have served the larger goal: claiming that mental health problems are the reason for the rising number of mass shootings in the US.
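
To make that stricter cut concrete, here is a minimal sketch of one way to run such a re-count, assuming the released spreadsheet is exported to CSV. The file name, column name, and keyword lists are my own hypothetical stand-ins, not Mother Jones’ actual schema or criteria; the point is that the headline number depends entirely on how each entry is classified.

```python
# Minimal sketch; "mass_shootings.csv", "mental_health_details", and the
# keyword lists below are hypothetical stand-ins, not the real schema.
import csv

CLINICAL_MARKERS = (
    "psychiatrist", "hospitalized", "psychiatric ward",
    "diagnosed", "medical records",
)
HEARSAY_MARKERS = (
    "neighbors said", "described as", "called him", "friends said",
)

def evidence_strength(note):
    """Crudely classify one 'mental health' cell by its source quality."""
    text = note.lower()
    if any(marker in text for marker in CLINICAL_MARKERS):
        return "clinical"   # medical/psychiatric records or expert testimony
    if any(marker in text for marker in HEARSAY_MARKERS):
        return "hearsay"    # secondhand statements made to the media
    return "unclear"        # no basis for classification either way

with open("mass_shootings.csv", newline="") as f:
    rows = list(csv.DictReader(f))

counts = {"clinical": 0, "hearsay": 0, "unclear": 0}
for row in rows:
    counts[evidence_strength(row.get("mental_health_details", ""))] += 1

print(counts)  # only the 'clinical' bucket can support the 38-of-62 claim
```

Different marker lists yield different counts, which is exactly why the methodology needed to be published alongside the number.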

This is called confirmation bias for a reason, and it’s as damaging in investigative journalism as it is in science. If you set out with a predetermined conclusion in mind and know what you expect to find, you’re going to find exactly that, no matter what the data actually say. Mother Jones fell into a classic cognitive trap here, which is a pity, considering the breadth and depth of the study. The sloppiness with the mental health data forces me to be suspicious of all the other data in the reporting, and of the quality of the investigative journalism at the publication as a whole.

The methodology for evaluating the information collected should have been clearly spelled out before the analysis began, and it should have been carefully developed to minimise the risk of bias. Errors of this nature should have been caught, discussed, and resolved long before the data analysis went to publication, and the fact that they weren’t speaks poorly of the journalistic rigour at Mother Jones. I won’t trust a presentation of data without any information about methodology, and I really won’t trust it when it comes from a publication with a history of massaging data to make it fit a given hypothesis.