A 30-year-old man wearing only his underwear was lying in a fetal position in the middle of a highway in Weirwood, Virginia, when a passenger bus struck and killed him. It was 1:30 in the morning. The bus belonged to a company called Virginia Seagull Travel, which is a "curbside" bus company, meaning it picks up and drops off passengers on a street curb instead of at a traditional station.
This tragic accident, which took place on July 9, 2008, had outsized influence on the findings of an error-riddled government study comparing the safety records of curbside vs. conventional buses, which I wrote about for Reason.com a couple of weeks ago. The National Transportation Safety Board, a federal agency, published the report.
Let's even grant that this particular accident could conceivably tell us something meaningful about the safety of curbside buses, even though an official investigation absolved Virginia Seagull Travel of all responsibility. With regard to this accident, the NTSB study's main statistical sin was ignoring the fact that the company is a tiny outfit. Over the course of six years, Virginia Seagull Travel maintained a fleet of just three buses and had this one fatal accident. The NTSB converted these numbers into a rate of 35 fatal accidents for every 100 buses. A much larger company, running 1,515 buses with 21 crashes in which someone was killed, was assigned a rate of 1.3 fatal accidents for every 100 buses. Ludicrously, the large company was weighted the same as Virginia Seagull Travel. By disregarding wide variations in company size, the NTSB report arrived at the headline-grabbing and misleading conclusion that curbside bus companies were "seven times" more fatal-accident prone than conventional bus companies. As I wrote in my article, "it's as if a rookie baseball player with three at bats and one hit received the same ranking as a starter with 600 at-bats and 200 hits."
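To see how much the lack of weighting distorts the picture, here is a minimal sketch (my own illustration, not the NTSB's actual methodology or data) using just the two companies described above. Averaging each company's per-bus rate as if the companies were equals gives a wildly different answer than pooling accidents over the combined fleet:

```python
# Illustration (not the NTSB's code): how averaging per-company accident
# rates without weighting by fleet size distorts the result. Figures are
# taken from the two companies described above; the 3-bus company's rate
# comes out near 33 per 100 here rather than the NTSB's 35, since the
# report's fleet counts were averaged over time.

companies = [
    {"name": "Virginia Seagull Travel", "buses": 3,    "fatal_accidents": 1},
    {"name": "Large carrier",           "buses": 1515, "fatal_accidents": 21},
]

# Per-company rates: fatal accidents per 100 buses.
rates = [100 * c["fatal_accidents"] / c["buses"] for c in companies]

# Unweighted mean: the 3-bus outfit counts exactly as much as the
# 1,515-bus carrier, so its one accident dominates the average.
unweighted = sum(rates) / len(rates)

# Fleet-weighted (pooled) rate: total accidents over total buses.
pooled = 100 * sum(c["fatal_accidents"] for c in companies) \
             / sum(c["buses"] for c in companies)

print(f"unweighted mean: {unweighted:.1f} fatal accidents per 100 buses")
print(f"pooled rate:     {pooled:.1f} fatal accidents per 100 buses")
```

The unweighted mean lands around 17 per 100 buses; the pooled rate, around 1.4. One accident at a three-bus company is enough to swamp 21 accidents across 1,515 buses, which is the distortion at the heart of the report.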
This instance of "statistical malpractice," which is how quantitative analyst and statistics expert Aaron Brown characterized it, isn't even the report's most egregious flaw. It also misclassified bus companies. Inexplicably, Peter Pan and Greyhound (among several other conventional carriers) were counted as curbside companies. Also, the NTSB press office trumpeted statistically insignificant findings, including the "seven times" figure mentioned above.
Now other reporters have taken a second look at the report and the National Transportation Safety Board has responded to several points in my original article.
Bloomberg News reporter Jeff Plungis, who covers the bus industry, has a piece out headlined, "NTSB Defends Study that Preceded Chinatown Bus Safety Sweep." Plungis writes:
The U.S. National Transportation Safety Board is defending itself against accusations it skewed a study that preceded the shutdown of 26 so-called Chinatown bus operations in the Northeast.
Anomalies in how the board classified motorcoach operators and calculated fatality rates raise doubts about its conclusion that curbside companies, including most Chinatown carriers, are about seven times more probable to have passenger deaths than companies using terminals, according to the Reason Foundation, a Los Angeles-based advocacy group for limited government.
The article goes on to state incorrectly that one of my main criticisms was that the study didn't factor in mileage when calculating accident rates. Mileage counts aren't in the raw data, so there would be no way to factor them in. It also fails to mention how the NTSB distorted its findings by blending accident rates of large carriers and small ones without weighting by size.
This omission allows the NTSB to get away with making hay out of the fact that reclassifying the bus companies doesn't, as the article notes, "alter the finding that curbside operators have been more dangerous." Technically it's true that if you correct the data by moving nearly all the fatal accidents from one column to the other it doesn't dramatically change the results. That's because of the very methodological flaw the article didn't mention: The companies weren't weighted by size, so many of the accidents have a minuscule effect on the results. Aaron Brown put it best: "If virtually all your data doesn't affect your conclusion, you're doing something wrong."
Over at Next City, Matt Bevilacqua offers a pretty good summary of the NTSB report's shortcomings, noting,
In a long article published last week, Reason magazine's Jim Epstein draws attention to what he considers a number of major flaws in the study. The most glaring of these is the fact that of 37 fatal crashes cited by the NTSB, a full 30 didn't involve curbside buses at all. Rather, "conventional" bus companies, or those with stops at actual stations, accounted for most of the accidents. The well-known conventional bus service Greyhound, for instance, was responsible for 24 of the crashes.
Bevilacqua does allow the agency's press officer, Eric Weiss, to get away with claiming that I put too much emphasis on one chart: "'[Epstein] pulls one of nine charts on one page of a 66-page report," Weiss said. "That's not even one of the goals of the report, and he says that it is the centerpiece of our report.'" He's referring to the appallingly inaccurate finding that curbside bus companies are "seven times" more fatal-accident prone.
Surely Weiss is being disingenuous, since he knows full well that his organization chose to highlight this chart in its press release, which in turn inspired major newspapers around the country to repeat this absurdity either in their headlines or at the top of their stories (e.g., "Chinatown Buses' Death Rate Said Seven Times That of Others").
Anyway, as I noted in my piece, while that chart is particularly offensive, the rest of the report is just as meaningless. If your data are wrong, your analysis is wrong across the board.
Bevilacqua also got Weiss to utter this howler in response to my charge that the NTSB reported statistically insignificant findings: "'We do not say it is statistically significant, we just put out the numbers.'"
To demonstrate the absurdity of that statement, suppose the NTSB had looked only at Virginia Seagull Travel on the day it happened to have a fatal accident. It would have found that curbside buses have a 100 percent fatal-accident rate. Would Weiss, in that case, also have just "put out the numbers"? Publishing numbers without mentioning that they're practically meaningless is profoundly irresponsible.
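The underlying problem is that a rate estimated from three buses carries almost no information. A rough sketch of what a confidence interval looks like at that sample size (my own simplified illustration, treating each bus as a single binomial trial, which is not how the NTSB framed its data):

```python
# Sketch: why "1 fatal accident among 3 buses" supports almost no
# conclusion. We compute a Wilson score 95% confidence interval for the
# proportion, treating each bus as one trial -- a simplification used
# purely for illustration.
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% when z=1.96)."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

lo, hi = wilson_interval(1, 3)
print(f"95% CI: {100*lo:.0f} to {100*hi:.0f} fatal accidents per 100 buses")
```

The interval runs from roughly 6 to 79 fatal accidents per 100 buses. An estimate that wide is compatible with nearly any underlying safety record, which is precisely what it means for a finding to be statistically insignificant: the numbers, "just put out," say almost nothing.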
A piece in The Weekly Standard recounts how the National Transportation Safety Board denied my request for the study data:
As [Epstein] was researching his piece, he asked the NTSB to share the data. They did not respond. So he filed a formal FOIA request for it. That was ignored, too. In particular, Epstein was interested in a chart showing the relative accident rates and confidence intervals. The NTSB told him that such a chart did not exist.
So my employer had to purchase bus accident data from a federal contractor and I had to rebuild the chart myself. Shortly after my article came out, the NTSB released the "nonexistent" chart.[*] Eric Weiss apologized to me via email, citing a "misunderstanding." As The Weekly Standard notes, "If you find that infuriating, imagine how it must feel to have been involved in one of the 27 curbside companies that was shut down" after the flawed report was published.
[*] The chart is hard to find on the NTSB's website, so I've reposted it here. It fully substantiates my case against the study. The NTSB's numbers do differ slightly from mine because the federal dataset they're drawn from is updated retroactively on a daily basis. I accessed the data about a year after the NTSB, so my numbers are more up to date.