Why Most Dangerous/Safest Cities Rankings Are Bunk
A new whine for a new list.
New Orleans is the most dangerous city in the United States, at least that's what you'll hear if you ask WalletHub. The app that “helps you make the most of your money” came out with a list of the places where you can supposedly feel the safest.
I often find lists like this are overly complicated, prone to conflating data reporting errors with “safety”, and bad at adequately contextualizing the data. This list from WalletHub is, of course, no exception.
To begin with, the list purports to “determine where Americans can feel most secure — in more than one sense” by comparing “more than 180 cities across 41 key indicators of safety.”
Their formula is easy to understand:
In order to determine the “safest” cities, the analysis takes 41 metrics across three broad categories: Home & Community Safety (17 metrics, 60 points), Natural Disaster Risk (6 metrics, 20 points), and Financial Safety (18 metrics, 20 points). Each of those 41 metrics is graded on a 100-point scale, with 100 being the best, and each metric is assigned a weight for good measure.
The result is a ranking of 182 cities from “safest” to…not.
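Mechanically, the scheme described above boils down to a weighted average of metric scores. Here's a minimal sketch; the metrics, weights, and cities are hypothetical stand-ins, since WalletHub doesn't publish its exact math:

```python
# Illustrative sketch of a WalletHub-style composite score.
# Metric names, weights, and city values here are hypothetical.

def composite_score(metric_scores, weights):
    """Weighted average of 0-100 metric scores (higher = 'safer')."""
    total_weight = sum(weights.values())
    return sum(metric_scores[m] * w for m, w in weights.items()) / total_weight

weights = {"murder_rate": 2.0, "hail_risk": 1.0, "storm_surge_risk": 1.0}

cities = {
    "City A": {"murder_rate": 90, "hail_risk": 40, "storm_surge_risk": 95},
    "City B": {"murder_rate": 70, "hail_risk": 95, "storm_surge_risk": 20},
}

# Rank from highest (safest) to lowest composite score.
ranking = sorted(cities, key=lambda c: composite_score(cities[c], weights), reverse=True)
print(ranking)  # → ['City A', 'City B']
```

The rankings are only as good as the weights, which is exactly where the trouble starts.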
A general list of sources is provided, but — as is usual with lists like this — the sources just tell us a bunch of organizations they got data from, with no easy way to check the math. They mention the FBI, for example, but don’t say what year of crime data they’re using or how they handle missing or incomplete data.
For me, the biggest blinking red light that something doesn’t add up in WalletHub’s rankings is that the Hurricane Storm-Surge Risk Index Score carries the same weight as the Hail Risk Index Score. Hail storms can be bad! But they are not city-altering.
New Orleans is among the most at-risk places for natural disasters, a risk every New Orleanian acknowledges every June through November. We haven’t gotten a second dog (or a cat) because we don’t have the space in the minivan to evacuate with another animal — that’s a pretty good indicator of the fear/risk of natural disasters there.
I know it seems like I’m here to tell you that my hometown of New Orleans is not the most dangerous place in America, but I am extraordinarily wary of any list that ranks New Orleans 71st out of 182 in terms of natural disaster risk. Any ranking that puts Shreveport as a riskier place than New Orleans for natural disasters is immediately suspect.
Going through the Home & Community Safety metrics raises some questions. The most heavily weighted metrics are terrorist attacks over the last 10 years and the murder rate. It’s unclear to me why terrorist attacks — events that can be poorly defined, extraordinarily rare, and unlikely to be repeated in the same place — are weighted so heavily as a determinant of community safety. Is a city that had a terrorist attack in 2016 really less safe now?
Murder rate at least makes sense as a metric for evaluating “safety”, but any reliance on FBI crime data is going to have issues. It’s already dated, so if you want to rank cities by most/least dangerous in 2025 or 2026, you are inherently using 2024 data at best.
New Orleans is particularly problematic in this situation because the city did not report data to the FBI in 2024. If you fall back to 2023 data for New Orleans, then you are using a substantially higher murder rate (53 in 2023 vs. 33 in 2024, with 2025 even lower), which ignores the massive drop in murder there over the last three years. New Orleans had the 3rd highest murder rate of the cities ranked using 2023’s rate, but the 9th highest using 2024’s rate.
Jackson, MS probably had the nation’s highest murder rate for any city of over 100k in 2024, but Jackson has not reported data to the FBI since 2019 when the murder rate was considerably lower. There are 15 cities in the rankings that didn’t have 2024 data or don’t have a dedicated police department reporting data (Columbia, MD, for example, is reported as part of the Howard County Sheriff’s Office).
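The fallback problem above can be sketched in a few lines. When a ranking silently uses each city's most recent available year, it ends up comparing different eras; the cities and rates below are hypothetical stand-ins, not FBI figures:

```python
# Sketch of the data-vintage problem: a ranking that silently falls back
# to an older year compares cities across different eras.
# Cities and murder rates below are hypothetical, not FBI figures.

reported = {
    "City A": {2023: 53.0, 2024: 33.0},   # reported both years; big recent drop
    "City B": {2023: 40.0},               # did not report 2024
}

def latest_rate(city):
    """Return the most recent (year, rate) pair a city reported."""
    years = reported[city]
    year = max(years)
    return year, years[year]

for city in reported:
    year, rate = latest_rate(city)
    print(f"{city}: using {year} rate of {rate}")
```

City B comes out looking worse than City A even though the comparison is really 2023 vs. 2024 — exactly the Jackson and New Orleans problem.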
Next is the rate of assaults. This seems straightforward, but whether that means aggravated assaults or all assaults (simple + aggravated) is never specified. Reviewing the data strongly implies that they mean aggravated assaults, but the list of safest cities doesn’t exactly match the FBI’s rates.
Pembroke Pines had the second-lowest aggravated assault rate in 2024 but comes in at 4th lowest on WalletHub’s list. Savannah, GA and Worcester, MA had the lowest rates but aren’t ranked at all by WalletHub.
The other problem with ranking cities like this is that what gets counted as an aggravated vs. simple assault can vary wildly by city. Virginia Beach reported more than 28 simple assaults for every aggravated assault in 2024, while Oakland reported a nearly 1-to-1 ratio.
Memphis did indeed have 52 times the rate of aggravated assaults that Virginia Beach did in 2024, but Memphis had only 3 times the simple assault rate. This doesn’t mean that either city is inherently reporting more correctly, but it does drive home how problematic some of these numbers can be. There is likely a difference in how these two cities classify offenses that does not reflect the actual level of “safety”.
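A quick sketch makes the distortion concrete. The per-100k rates below are illustrative numbers shaped like the 2024 pattern described above (Virginia Beach at roughly 28 simple assaults per aggravated, Memphis at roughly 52x Virginia Beach's aggravated rate but only 3x its simple rate), not the actual FBI figures:

```python
# Sketch of how classification practices distort aggravated-only rankings.
# Rates (per 100k) are illustrative, not actual 2024 FBI figures.

cities = {
    "Virginia Beach": {"aggravated": 25, "simple": 700},    # 28:1 simple-to-aggravated
    "Memphis":        {"aggravated": 1300, "simple": 2100}, # 52x VB aggravated, 3x simple
}

agg_gap = cities["Memphis"]["aggravated"] / cities["Virginia Beach"]["aggravated"]
total_gap = sum(cities["Memphis"].values()) / sum(cities["Virginia Beach"].values())

print(f"aggravated-only gap: {agg_gap:.0f}x")   # → 52x
print(f"all-assault gap:     {total_gap:.1f}x") # → 4.7x
```

Rank on aggravated assaults alone and the gap looks like 52x; count all assaults and it shrinks to under 5x. How a department draws the aggravated/simple line moves a city up or down the list.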
This is exactly why the FBI is so insistent that people fight the urge to rank cities. From the FBI:
“Each year when Crime in the United States is published, many entities—news media, tourism agencies, and other groups with an interest in crime in our Nation—use reported figures to compile rankings of cities and counties. These rankings, however, are merely a quick choice made by the data user; they provide no insight into the many variables that mold the crime in a particular town, city, county, state, region, or other jurisdiction. Consequently, these rankings lead to simplistic and/or incomplete analyses that often create misleading perceptions adversely affecting cities and counties, along with their residents.”
The WalletHub ranking also uses rape and theft as crime categories but apparently does not use robberies, burglaries, or motor vehicle thefts for some unknown reason.
Moving on, law enforcement employees — not officers — are included, but it’s not clear whether that is a sign of safety or a lack thereof. A higher rate of officers generally correlates with a higher violent crime rate, so dinging Irvine, CA for having the third-lowest rate of officers per capita (among cities of 100k+ in 2024) doesn’t make a ton of sense.
Three other Home & Community Safety metrics stand out as particularly egregious inclusions. The first is including hate crimes. If you’re familiar with my analyses, you may recall that I have thoughts about hate crimes data (expressed previously here, here, and here.) Reporting can be incredibly spotty and varied between departments and years, and there is never a way to know the extent to which more reported hate crimes reflects better community reporting to police, better reporting by police to the FBI, or more actual crimes.
Another flawed metric that stands out is Perception of Safety. This is defined as being “based on perceptions of visitors of Numbeo website in the past 3 years. If the value is 0, it means it is perceived as very low, and if the value is 100, it means it is perceived as very high.”
New Orleans receives millions of visitors every year, but Numbeo’s perception of crime in the city is based on 147 contributors who agreed to fill out a survey on crime, presumably over the last 3 years. And this metric is given the same weight as assaults and thefts in determining how safe a city is.
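Even setting aside the self-selection bias (the bigger problem with a volunteer web survey), a sample of 147 is statistically noisy. A back-of-the-envelope margin of error, assuming a simple proportion-style estimate:

```python
# Back-of-the-envelope noise in a perception index built from n = 147
# self-selected respondents. This ignores self-selection bias entirely,
# which is the larger issue; it treats the index like a simple proportion.
import math

n = 147
p = 0.5  # worst-case variance for a proportion
margin = 1.96 * math.sqrt(p * (1 - p) / n)  # ~95% margin of error

print(f"+/- {margin * 100:.1f} percentage points")  # → +/- 8.1 percentage points
```

Eight points of pure sampling noise, before accounting for who chooses to fill out a crime survey — and that number carries the same weight as actual assault and theft rates.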
Then there’s the Share of Sheltered Homeless which is only given half weight. What is the degree of “safety” provided by a city with a small, mostly unsheltered homeless population versus a city with a larger, mostly sheltered population?
I have issues with other metrics, but those are the ones that stuck out the most. I could dig deeper into the Financial Safety category, but it is just window dressing, as the Home & Community Safety category largely determines a city’s overall “Safety” ranking, especially among the 100 or so cities with the most crime.
Ultimately what we are left with is an impossibly complex web of flawed individual metrics that say little about these cities other than that the cities reporting more crime are considered less safe than the cities reporting less crime.
Lists like this drive home why I prefer to compare a city’s trends against itself rather than against other cities. Is New Orleans “safer” than Memphis or Oakland or St Louis? That’s an arbitrary and impossible-to-answer question. Is New Orleans or Memphis or Oakland or St Louis getting safer? That one we can answer pretty confidently.
Which city is the safest or least safe in America is simply an unanswerable question, and lists like this one use the complexity of piling on more metrics to obfuscate that fact, in my humble opinion.
New In The Jeff-alytics Podcast
I talked with immigration policy expert Dara Lind about the connection (or lack thereof) between immigration and crime. We chat about what people tend to get wrong and the critical importance of historical context for understanding the present moment. Check it out below or wherever you get your podcasts!
And be sure to check out other recent episodes:
Atlantic County Prosecutor William Reynolds
Former Bureau of Justice Statistics Director Alex Piquero
I also object to trying to rank an entire city. Houston can take almost an hour to cross in an automobile, and crime rates vary wildly depending on where you are. Years ago I studied the local police crime map in my neighborhood and realized that the crime rate was easily double within 2 blocks of the freeway or near certain cheap apartment complexes in my area. Assigning a single number to something so complex is just silly.
Law enforcement officer numbers depend on a number of factors, of which the crime rate is only one, and not the most determinative at that. LEO numbers depend on how many officers the jurisdiction can afford, what the priorities of the governing body (city council, etc.) may be, and, sometimes, what the public will tolerate. I suspect Irvine has the lowest number of officers because that's all they can afford.
Hate crimes shouldn't even be considered, because the category necessarily includes other crimes that are already reported. The concept of "hate" crimes depends on accepting the premise that an aggravated assault, for instance, is somehow more egregious because of the perpetrator's motivation as opposed to the perpetrator's actions.