
Have We Found Gun Laws That Could Reduce U.S. Gun Deaths by Over 90 Percent? Not Really

The Lancet study is far from proving its case, and highlights the difficulty of using statistical analysis to draw causal conclusions about laws' effects.


A new study published in the British medical journal The Lancet last month comes to a startling, and pretty ridiculous, conclusion: that nationwide application of just three gun laws all at the same time would reduce American gun mortality by well over 90 percent. (They predict those laws could get our national firearm death rate down to 0.16 per 100,000 from 10.1 in the year they studied.) That's a level lower than any we've seen as far back as gun death statistics reach.
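The arithmetic behind "well over 90 percent" is straightforward; here is a trivial check using the study's own figures (my own calculation, not anything from the paper):

```python
# A trivial check of the claimed reduction, using the study's own figures
# (my own arithmetic, not code from the paper).
before, after = 10.1, 0.16   # firearm deaths per 100,000
reduction = 100 * (before - after) / before
print(f"implied reduction: {reduction:.1f}%")   # roughly 98.4 percent
```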


That conclusion (as is likely the point with studies like this) was a real headline-maker, with CNN declaring "Study: 3 federal laws could reduce gun deaths by more than 90%"; Ars Technica alerting us "Three laws could cut US gun deaths by 90%, study says"; and PBS asking "Could these 3 laws reduce gun deaths by 90 percent?"

The study was written by Bindu Kalesan (of Boston University Department of Medicine), Matthew E Mobily (Epidemiology Department at the Mailman School of Public Health), Olivia Keiser (Institute of Social and Preventive Medicine at the University of Bern), Jeffrey A Fagan (Columbia Law School and Department of Epidemiology), and Sandro Galea (Boston University School of Public Health).

The researchers looked at just two years of gun death data, 2008 and 2010, derived from the Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System. Data for the years before and after were available for examination, had they cared to check the robustness of their conclusions.

The authors examined 25 state-level firearm laws in place in 2009, and controlled for "state-specific characteristics such as firearm ownership for 2013, firearm export rates, and non-firearm homicide rates for 2009, and unemployment rates for 2010."

That rather arbitrary selection of years, for data and estimates that were in theory available for the actual years they studied, struck other statistical researchers I consulted as interesting and a little peculiar.

I asked Dr. Kalesan about the particular choices of years to study and other variables to control for, which seemed rather random to this layman. Her emailed response: 

This dynamic state of multiple laws makes it very difficult to obtain annual changes quantitatively equally between different states. This is the reason that we chose one year of gun laws (2009 laws) and two years of firearm fatality rates (the 2008 and 2010 firearm fatality rates). We consider year 2009 to be a momentous year, due to passing of the first radical pro-firearm legislation, where the Obama administration allowed carrying of guns in national parks, replacing the Reagan administration law that prohibited open carrying in national parks and wildlife refuges. This year represents the beginning of a turning point in the strengthening of pro-gun legislation in the US. Since we used only a slice of the data, we restricted the study to specific covariates and did not use a large number of social variables that may also pose statistical issues such as multicollinearity. 

Let's walk through what the study claims to have found, and how it claims to have found it.

Most state-level gun laws, they found, do not seem to be associated with reduced firearm mortality: 16 of the 25 laws they examined either seemed to have no discernible effect or to increase firearm mortality (more on which later).

But the headline-grabbing part is their insistence that three of the nine laws they claim reduce firearm deaths do so in an almost shockingly huge way, such that:

Projected federal level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10.35 to 4.46 deaths per 100,000 people, background checks for ammunition purchase could reduce it to 1.99 per 100,000, and firearm identification to 1.81 per 100,000.

That's amazing, especially since the first thing a layman might think is: if those laws would have that effect nationally, shouldn't they have had that effect, or something close to it, in the states where they actually exist? And if the effect is genuine, shouldn't it persist in years beyond the small handful the researchers looked at?

That does not seem to be true.

Looking at 2014 gun death data for states (using their own classifications) that have universal background checks (which they say nationally would lead to a 4.46 death rate per 100,000 people), we find that of those seven states, only two of them had a gun death rate lower than that—Hawaii with 2.6 and Rhode Island with 3.0—and that two of them, Pennsylvania and Maryland, had gun death rates more than twice that.

To be more sure of causation, one ought to go back and see what those states' gun death rates were before the background check laws were passed. Not to mention the likely overreach involved in drawing such a bold conclusion from a mere seven states, five of which now have higher death rates than the study claims those laws would deliver to the nation.

Their even bolder claims about ammunition background check laws and firearm identification (meaning ballistic fingerprinting or microstamping of guns) look more obviously unlikely still. There were only three states with the former on which they based their wild claim: Illinois, Massachusetts, and New Jersey. So what were the respective gun death rates of those three states, whose laws this study asserts would, if imposed nationally, bring the national rate down to 1.99?

For 2014 (again, not a year they studied, but if those laws have causal effects we should see them moving forward as well): 9.0, 3.2, and 5.3, respectively. (Again, no looking at numbers and trends before those allegedly powerful death-reducing laws passed, and no wondering whether three states are a big enough sample on which to base their headline-grabbing conclusion.)

And what about those firearm identification laws that would, if imposed nationally, allegedly lead to a 1.81 gun death rate? Again only three states have them, in this case Maryland, New York, and California. And those states' respective 2014 gun death rates were 9.0, 4.2, and 7.4. All quite a bit higher than 1.81.

According to the generally pro-gun control researcher Daniel Webster, director of the Johns Hopkins Center for Gun Policy & Research, those states weren't even enforcing those "firearm identification laws in a meaningful way in the years this study looked at."

Dr. Kalesan in an email acknowledged that the question of law existing vs. law enforced is "a chronic challenge to this type of work….However, the passage of a law is associated with a range of hard-to-measure actions on the part of law enforcement, many in anticipation of the law being fully implemented, that in and of themselves may well result in changes, even if not directly linked to the law itself."

She and her co-authors, in other words, believe it is perfectly plausible that the mere anticipation on the part of would-be killers or self-killers that a new handgun they buy would some day be microstamped has enormous effects on how many people die from guns.

The study itself says: 

We showed that federal-level implementation of the three most strongly associated laws— universal background checks for firearm purchase, background checks for ammunition, and requiring firearm identification by either microstamping or ballistic fingerprinting—would substantially reduce overall national firearm mortality…..On the basis of our model, federal implementation of all three laws could reduce national overall firearm mortality to 0·16 per 100000.

That's down from 10.1 per 100,000 according to their own study, an amazing reduction of well over 90 percent. That sounds like a causal claim to me, but in an email, despite the study's own language above, Dr. Kalesan said that "We specifically state that this is a cross sectional study. It does not imply causation."

To the lay reader, those sorts of conclusions all seem preposterous on their face. But these are scientists, doing statistical analysis, in a very famous scientific journal. Even this layman realizes that if there were other influences that increased gun deaths beyond what they otherwise would have been, that could explain why we don't actually see those death-reducing effects in our complicated, confusing reality.


"The question of what is or is not 'plausible' is ultimately an empiric question," Dr. Kalesan wrote when I asked, well, doesn't this seem just utterly implausible? (Other pro-gun-control researchers feel the same, such as David Hemenway of Harvard's School of Public Health who told the Washington Post that "That's too big—I don't believe that…These laws are not that strong. I would just be flabbergasted; I'd bet the house if you did [implement] these laws, if you had these three laws and enforced them really well and reduced gun deaths by 10 percent, you'd be ecstatic.")

Kalesan writes:

….it strikes us as plausible indeed that background checks for purchase of ammunition—effectively making sure that we limit ammunition in the wrong hands— may in fact substantially reduce the gun death rate. In line with this, we found a somewhat smaller reduction in gun death rates for background checks for firearm purchases. The cumulative effect of three different laws, where we currently have at least effective federal laws, may very well be possible…..

However, we are very careful in the paper to note that we believe that this is a long-term effect and thus will take several years to occur after passing and implementation of the laws. We note in the paper: "Assessment of the effect of legislative policies is akin to assessment of the effect of natural experiments or real-world data. We expect the fall in mortality to be a long-term effect and might take years to occur."

In Reason's February issue I wrote a wide-ranging survey of how gun social science applies to gun policy debates, "You Know Less Than You Think About Guns."  The authors of this Lancet study are, as I was in that story, dubious of attempts to figure the effect of gun laws on gun deaths that rely on "an arbitrary legislative strength score" or "the effect of a select few laws." 

They instead looked at 25 different laws separately, and found nine that they conclude have some association with reduced firearm deaths:

state licence to sell firearms, keeping and retaining of sales records, at least one store security precaution, firearm identification, reporting of lost or stolen firearms, universal background checks for all firearms, safety training or testing requirement to purchase firearms, law enforcement involvement in obtaining of permits, and background checks for the purchase of ammunition.

The same techniques find that nine other common gun laws are associated with increased firearm deaths. In most cases, it would require some imagination to guess at a causal mechanism by which these laws would increase gun deaths.

Most of the laws listed below are obviously intended to increase gun safety, such as closing the "gun show loophole" (that is, applying background check laws to gun shows in states that don't require them for all gun sales) and "assault weapon bans."

Still, here is the list of gun laws the study associates with increased gun deaths:

a requirement for the dealer to report records to the state for retention, allowing police inspection of stores, limiting the number of firearms purchased, a 3-day limit for a background-checks extension, background checks or permits during gun shows in states without universal background-check requirement (ie, closure of the gun show loophole), integrated or external or standard locks on firearms, a ban or restrictions placed on assault weapons, law enforcement discretion permitted when issuing concealed-carry permits, and stand-your-ground.

Not that this would be warranted either, but I saw no headlines crowing that giving cops a say in issuing concealed carry permits, assault weapon bans, or laws limiting the number of firearms purchased have been "proven by science" to lead to more gun deaths.

Do the statistical techniques used in this Lancet study support any real confidence in the conclusion? Can it possibly be the case that background checks for ammo and microstamping would reduce our gun death rate to a level lower than we have pretty much ever seen in any state, anywhere, ever?

Aaron Brown, author of Financial Risk Management for Dummies, who uses statistical analysis in his work, examined the study at my request and found some interesting wrinkles, starting with the fact that by their measures a combination of suggested gun laws would result in a negative gun death rate. Brown also notes that we have had federal background check laws, and wonders, if Kalesan et al. are on to something, "why did we not see the predicted soaring gun death rate when the laws were enforced?"

Brown found lots of analytical curiosities that should have given the researchers pause. For example: "Why would record keeping and retention rules cut gun death rates by 21% in Alaska, but raise them 66% in Florida (the 95% confidence intervals are listed as a reduction of 15% to 26% in Alaska, and an increase of 47% to 88% in Florida)? Essentially all the rules have wildly varying effects by state in their model; this is not even an extreme example. So not only is it absurd that these minor rules could make such huge differences, but it's doubly absurd that they have wildly different effects in different states."

On the microstamping point, Brown, like Johns Hopkins' Webster above, notes that the laws weren't even in practical effect during the years the researchers looked at. But even if they were, given that the three states supposedly with the laws "comprise 20% of the US population, and their gun death rates obviously wouldn't be affected by a federal law duplicating their (alleged) state law, there would be a national rate of 1.4 per 100,000 even if the other 47 states and the District of Columbia got their rates down to zero."
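Brown's floor is easy to reproduce with back-of-the-envelope arithmetic. In the sketch below, the 2014 rates are those cited above, while the state and national populations are approximate 2014 figures I am supplying (they are not from the study):

```python
# Back-of-the-envelope version of Brown's floor argument (my own sketch).
# If Maryland, New York, and California keep their 2014 rates (a federal law
# would merely duplicate their existing state laws), the national rate cannot
# fall below roughly 1.3 per 100,000 even if every other state drops to zero,
# far above the 0.16 the study projects for all three laws combined.
rates_per_100k = {"MD": 9.0, "NY": 4.2, "CA": 7.4}    # 2014 rates cited above
pop_millions = {"MD": 6.0, "NY": 19.7, "CA": 38.8}    # approximate populations
us_pop_millions = 318.9                               # approximate US population

# deaths = (rate per 100,000) * (population in millions) * 10
deaths = sum(rates_per_100k[s] * pop_millions[s] * 10 for s in rates_per_100k)
floor = deaths / (us_pop_millions * 10)               # back to per 100,000
print(f"national floor: {floor:.2f} per 100,000")     # about 1.33
```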

Another of the many problems with the study, Brown notes, "is they have only 51 states and they use 29 independent variables. What that does is make every variable look highly significant, but in mixed directions. That's why they find such gigantic and highly statistically significant effects in both directions, from laws that couldn't have anywhere near those impacts."

Brown provided a layman's example analogous to how the Lancet researchers tried to prove their point.

A guy claims, "Women are all worse drivers than men." You say, "What about Danica Patrick who was third in the Indy 500?" He replies, "Except for Go Daddy girls." You say, "What about Aubrey McClendon who drove his car into an embankment?" He comes back, "Except for energy company CEO's." Now a few of these kinds of qualifications might be reasonable, but suppose there are only 51 drivers, and it takes him 29 qualifications before you can't think of any women who drive better than men, adjusting for his qualifications. Obviously there's no information there.

Looking at so many different laws at once allows the researchers to conclude that certain laws would have had amazing gun-death-reducing effects only by simultaneously assuming that other laws had improbably amazing gun-death-increasing effects. As Brown notes:

It's the multivariable problem. Let's say you checked only one law. 25 states have the law and have 9 gun deaths per 100,000, and the 25 that don't have 10 gun deaths per 100,000. You might guess that the law cuts gun deaths by 10%. It wouldn't be proof, maybe low violence states pass more gun laws rather than the other way around, or maybe it's random noise, but it does suggest that the law might work. You can't get too crazy with one variable.

But let's say you checked 50 laws instead of one. You could come up with a model that predicted every state's gun death rate exactly, by saying some laws had a gigantic positive effect, and some a gigantic negative effect. You'd just choose the effects carefully so they offset in every state. But if you then picked the three biggest negative effects, they would predict that passing those three laws and no others would reduce the gun death rate far lower than any one actual state.
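To make the point concrete, here is a minimal simulation sketch (my own illustration, not code from Brown or from the study):

```python
# A minimal simulation sketch of Brown's point (my own illustration): with
# 51 "states" and dozens of random "laws" that have no real effect, least
# squares can fit the rates almost exactly by assigning large offsetting
# coefficients, and cherry-picking the most negative ones then "predicts"
# an impossibly low (sometimes negative) gun death rate.
import numpy as np

rng = np.random.default_rng(0)
n_states, n_laws = 51, 50

laws = rng.integers(0, 2, size=(n_states, n_laws))    # random 0/1 "laws"
rates = 10 + rng.normal(0, 2, size=n_states)          # noise around 10 per 100k

X = np.column_stack([np.ones(n_states), laws])
coefs, *_ = np.linalg.lstsq(X, rates, rcond=None)
intercept, effects = coefs[0], coefs[1:]

fitted = X @ coefs
r2 = 1 - ((rates - fitted) ** 2).sum() / ((rates - rates.mean()) ** 2).sum()
print(f"R^2 when the 'laws' explain nothing: {r2:.2f}")   # typically near 1

three_best = np.sort(effects)[:3]                     # most "death-reducing"
print("lowest rate actually observed:", round(rates.min(), 2))
print("rate 'predicted' by passing only those 3 laws:",
      round(intercept + three_best.sum(), 2))
```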

The overarching problem is that this sort of statistical analysis cannot, in a certain sense, prove what these sorts of studies think they are proving without some understanding of human beings and their incentives, and some plausible story about how certain causes might produce certain effects.

You can observe, Brown explains, that "states with some particular gun law have lower gun death rates. That suggests the law might reduce gun deaths….Now a statistician will ask, how many states have the law? If it's only one state, there's a 50% chance it would have a below median gun death rate even if the law makes no difference. So it's no evidence at all. Two states is still a 25% chance. Only if you have five states (1 chance in 32 if the law makes no difference) or more and all have below median gun death rates do you have a conventionally significant result (the probability of the result is less than 5% if the law doesn't affect gun death rates)…."
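Those probabilities are just repeated coin flips under the assumption that a law does nothing; a quick check (again my own sketch, not Brown's):

```python
# Under the null hypothesis that a law has no effect, each state with the
# law independently has about a 50% chance of falling below the national
# median gun death rate, so the chance that all n such states do is 0.5**n.
for n in (1, 2, 5):
    print(f"{n} state(s) all below the median by chance: {0.5 ** n:.3f}")
# 1 -> 0.500, 2 -> 0.250, 5 -> 0.031 (just under the conventional 0.05 cutoff)
```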

But the raw fact of statistical significance doesn't tell you much about causation, except that you've successfully observed a correlation between law and result. There is much more work to be done, because you don't know whether the law caused the lower death rates or whether, maybe, places with low gun death rates are for some reason more likely to pass such laws. You'd need to check what happened both before and after the laws were passed, which this study doesn't do. And you'd need a plausible story by which the correlation could imply causation.

"This study in particular has no logical case," Brown says. "They show that if you assume some gun laws cause big increases in gun death rates and come cause big decreases, and which laws do which vary by state, you can choose numbers so that your predicted gun death rates match the data pretty well."

This study was pretty widely shellacked, even by a community of researchers who staunchly believe in the importance of tough gun laws. Besides the critiques from Webster and Hemenway quoted and linked above, another faculty member of the Johns Hopkins Center for Gun Policy and Research (an institution generally favorable to gun control), Cassandra Crifasi, bluntly told PBS: "To put it lightly, I was surprised to see a paper of this quality, or lack of quality, in a journal like the Lancet."

But I suspect that for many, the news headlines crowing about gun laws, 90 percent cuts in gun deaths, and scientific studies in reputable journals accomplished what I'm always darkly suspicious is the real point of "social science" like this: to get it stuck in the heads of people only half-paying attention that "science has proven" we need tougher gun laws and that such laws will do only amazingly great things. But it is simply not reasonable to believe that's true, or close to true.

Last year I critiqued an earlier shoddy study, from the American Journal of Public Health, claiming that Connecticut's universal background check and permit requirements for legally obtaining a gun proved such laws can or will reduce gun homicides by 40 percent.