A few weeks ago, I came across research claiming to rank states by the safety of their transit systems. I usually enjoy studies like this, but something felt off. The rankings were based on state population rather than transit ridership, a strange choice, since state population doesn't necessarily correlate with how much people use public transit. Ridership, on the other hand, gives a much clearer picture of a transit system's scale and risk profile.
Since ridership data is easily accessible to anyone looking at safety statistics, I can only assume the authors either overlooked it or chose to ignore it, skewing the results through carelessness or by design.
The study came from Foy and Associates, a law firm in Atlanta, Georgia.
Media outlets across the country picked up the story, including the New York Post and MyNorthwest (KIRO). Both articles followed the same formula: cite the study’s rankings, then list specific examples of dangerous incidents on buses and trains to reinforce the findings.
The problem? The “study” itself wasn’t publicly available. Neither article linked to the original report or provided detailed citations, just vague references to federal data. When I visited the law firm’s website, I couldn’t find the report there either.
I emailed the reporter at MyNorthwest but haven’t heard back.
The media coverage did include a “Top 10” list of the most dangerous states for transit safety, along with some supporting numbers. So I worked with what I had.
When I re-ranked the data using ridership instead of state population, the results changed significantly. This is the source of the data I used for by-state ridership (see what I did there?).
| Rank per resident | Rank per rider | Rank change | State | Violent incidents | Fatalities | Injuries | Fatalities + injuries | Occurrences per 100,000 residents | Ridership | Occurrences per 1,000,000 riders |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 3 | 1 | 2 | Minnesota | 309 | 6 | 316 | 322 | 11 | 60,573,551 | 5.32 |
| 2 | 2 | 0 | Illinois | 817 | 6 | 845 | 851 | 13.3 | 354,333,670 | 2.40 |
| 8 | 3 | 5 | Texas | 412 | 14 | 404 | 418 | 2.7 | 206,667,280 | 2.02 |
| 9 | 4 | 5 | Arizona | 95 | 2 | 95 | 97 | 2.6 | 59,021,226 | 1.64 |
| 5 | 5 | 0 | Pennsylvania | 314 | 6 | 316 | 322 | 4.9 | 261,503,764 | 1.23 |
| 7 | 6 | 1 | Maryland | 94 | 5 | 91 | 96 | 3.1 | 80,880,788 | 1.19 |
| 4 | 7 | -3 | Massachusetts | 280 | 1 | 287 | 288 | 8.1 | 262,844,430 | 1.10 |
| 6 | 8 | -2 | California | 852 | 31 | 849 | 880 | 4.4 | 835,540,484 | 1.05 |
| 1 | 9 | -8 | New York | 1,641 | 23 | 1,759 | 1,782 | 17.5 | 3,085,846,331 | 0.58 |
| 10 | 10 | 0 | Washington | 97 | 5 | 95 | 100 | 2.6 | 187,833,176 | 0.53 |
Here is the spreadsheet I worked from.
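If you want to sanity-check the re-ranking yourself, here’s a minimal Python sketch. It hard-codes the figures from the table above and recomputes the per-rider rate as (fatalities + injuries) per million riders, which is consistent with the table’s last column; sorting descending on that rate reproduces the “Rank per rider” order.

```python
# Recompute the per-rider rankings from the figures in the table above.
# Rate = (fatalities + injuries) per 1,000,000 riders, matching the
# "Occurrences per 1,000,000 riders" column.

states = [
    # (state, fatalities, injuries, annual ridership)
    ("Minnesota",      6,   316,    60_573_551),
    ("Illinois",       6,   845,   354_333_670),
    ("Texas",         14,   404,   206_667_280),
    ("Arizona",        2,    95,    59_021_226),
    ("Pennsylvania",   6,   316,   261_503_764),
    ("Maryland",       5,    91,    80_880_788),
    ("Massachusetts",  1,   287,   262_844_430),
    ("California",    31,   849,   835_540_484),
    ("New York",      23, 1_759, 3_085_846_331),
    ("Washington",     5,    95,   187_833_176),
]

# Per-rider rate, scaled to occurrences per million riders.
rates = [
    (name, (fatalities + injuries) / ridership * 1_000_000)
    for name, fatalities, injuries, ridership in states
]

# Sort from highest to lowest rate and print the re-ranked list.
for rank, (name, rate) in enumerate(sorted(rates, key=lambda r: -r[1]), start=1):
    print(f"{rank:>2}. {name:<15} {rate:.2f} per 1M riders")
```

Run it and New York lands ninth at 0.58 per million riders, with Minnesota first at 5.32, exactly as the table shows. The point isn’t the code, it’s that switching the denominator is a one-line change with a dramatic effect on the rankings.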
For example, New York (touted as the most hazardous in the media coverage) drops from first to ninth when you account for ridership. Texas and Arizona each jump five spots, moving from the bottom half to the top half of the rankings. And that’s just using the limited data that was made public. Without knowing the original source data or methodology, we can’t even say if states outside the top 10 might belong in it under a more reasonable metric. So even Washington’s stable placement “in the top 10” is suspect.
So what do I make of all this?
I think it’s a troubling example of using data to support a predetermined narrative: that public transit is unsafe. I understand why an Atlanta-based law firm would produce content like this. It’s a form of content marketing that positions them as experts, especially if you’re considering a lawsuit over a transit-related injury.
But even media outlets with a specific editorial angle should be more responsible. Passing along research without showing the notes (and failing to ask basic questions about methodology) isn’t just lazy journalism. It’s misleading, and it undermines public understanding of real transit safety issues.