I occasionally get suggestions for this newsletter from readers and well-wishers. Most of them don’t work as potential topics, usually because the general idea is wrong, the data isn’t available to write about, or the idea doesn’t fit the general theme of my newsletter and writing.
Every once in a while, however, I get a great idea that is both worth exploring and can be reasonably explored with existing data.
Such was the case when Oren Gur, Policy Advisor and Director of Research at the Philadelphia District Attorney's Office and Director of its Transparency Analytics (DATA) Lab, suggested a theory about clearance rates. Gur texted me one night many months ago, saying “What I’m thinking is that if the number of homicides is high for several years and then goes down, based on how clearance rate is calculated, the clearance rate in following years will be very high. Because they had many open cases from the previous years, but the denominator is now low.”
It immediately seemed like a potentially interesting topic.
Taking a step back, clearance rates are a proxy measure used by police departments and the FBI to evaluate the share of crimes ending in an arrest. The FBI defines clearance rates as:
A law enforcement agency reports that an offense is cleared by arrest, or solved for crime reporting purposes, when three specific conditions have been met. The three conditions are that at least one person has been:
Arrested.
Charged with the commission of the offense.
Turned over to the court for prosecution (whether following arrest, court summons, or police notice).
Offenses can also be cleared by exception, meaning the law enforcement agency has identified the offender, has enough evidence to make an arrest, knows exactly where the offender is, but cannot arrest them for reasons outside the agency’s control. In a murder-suicide, for example, the case is cleared by exception because the offender is deceased.
There’s one other component to clearance rates to consider that makes them a potentially misleading statistic. The denominator in the equation is the number of offenses that are reported in a given year (say there were 78 murders in a city in 2024) while the numerator is the number of offenses cleared in that year by arrest or exception regardless of when the offense occurred. A theft on December 31st, 2023 that gets solved the next day is a 2023 offense and 2024 clearance.
This leads to weird results sometimes, like Costa Mesa (CA) reporting a 1,700 percent murder clearance rate in 2020 (2 murders, 34 cleared). Or Slidell (LA) technically reporting a -400 percent clearance rate in 2014 (-1 murder, 4 cleared).
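The arithmetic behind those oddities is simple to reproduce. A minimal sketch, using the Costa Mesa and Slidell figures cited above:

```python
def clearance_rate(cleared_this_year, offenses_this_year):
    # Numerator: offenses cleared during the year, regardless of when
    # the offense occurred. Denominator: offenses reported that year.
    return cleared_this_year / offenses_this_year

# Costa Mesa, 2020: 34 clearances against 2 reported murders.
print(f"{clearance_rate(34, 2):.0%}")   # 1700%
# Slidell, 2014: 4 clearances against a net count of -1 murder.
print(f"{clearance_rate(4, -1):.0%}")   # -400%
```

Because the numerator and denominator draw from different pools of cases, nothing caps the ratio at 100 percent or even keeps it positive.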
With the weirdness of clearance rate calculations in mind, let’s hypothetically oversimplify and say an agency has the resources dedicated to solve around 50 percent of murders that occur in a year, 10 percent of the previous year’s murders, and a handful of murders older than a year that we’ll call a constant of 5 old murders per year.
Now let’s take this hypothetical city and apply what has essentially happened in the US in terms of surging and falling murders since 2020. Murder surged in 2020 and has fallen steadily ever since to roughly the same level that it was in 2019.
In this hypothetical the agency’s murder clearance rate would fall from 70 percent to 60 percent the year of the surge before increasing steadily as murder falls until it’s 74 percent several years later. In this scenario, the clearance rate has gone up, but the agency hasn’t gotten any better at solving that year's crimes. The agency simply has more older murders to solve in the numerator to compare to a falling count of murders in the denominator.
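A quick simulation makes the arc concrete. The murder counts below are assumptions chosen to fit the hypothetical (a steady baseline of 50 murders, a surge to 100, then a steady fall back to 50), not real data:

```python
def cleared(prev_murders, curr_murders):
    # The hypothetical agency solves half of this year's murders,
    # 10 percent of last year's, plus a constant 5 older cases.
    return 0.5 * curr_murders + 0.1 * prev_murders + 5

# Assumed counts: steady baseline, surge year, then a steady decline.
murders = [50, 50, 100, 90, 80, 70, 50]

for prev, curr in zip(murders, murders[1:]):
    rate = cleared(prev, curr) / curr
    print(f"{curr:>3} murders -> {rate:.0%} clearance rate")
```

The printed rates fall from 70 percent to 60 percent in the surge year, then climb back to 74 percent by the final year, even though the agency's year-to-year effort never changes.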
This hypothetical seems to match what we’re seeing in Philadelphia with murder at a decade low and the police department clearing 70 percent of murders through October.
And Baltimore as well where murder is down 24 percent while BPD is reporting a large jump in the city's murder clearance rate.
The impact of old murders on clearance rates can be seen in NYPD data for 2023 and 2024, where an identical 45 percent of current-year murders were cleared through September of each year, but a larger number of older murders were cleared last year than this year so far, leading to a slightly higher overall clearance rate last year.
The real world isn’t quite as simple as the hypothetical world, but there’s evidence that cities with falling murders tend to see increasing murder clearance rates. To show this, I gathered data from cities that had at least 20 murders in a given year and reported both murder and clearance data for that year and the following year between 2000 and 2023.
Cities that saw murder decline tended to see increasing clearance rates while cities that saw murder increase tended to see decreasing clearance rates over that span. Just about 60 percent of cities that saw a decline in murders from one year to the next between 2000 and 2023 also had an increase in murder clearance rates. Most cities that had increasing murders over that span had declining clearance rates from one year to the next.
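That comparison can be sketched in code. The city figures below are made up for illustration; the actual analysis used agency-reported murder and clearance data for 2000 to 2023:

```python
# Made-up city data: {city: {year: (murders, clearance_rate)}}.
records = {
    "City A": {2022: (40, 0.55), 2023: (30, 0.65)},
    "City B": {2022: (25, 0.60), 2023: (35, 0.50)},
    "City C": {2022: (50, 0.45), 2023: (45, 0.40)},
}

rose, total = 0, 0
for years in records.values():
    for year in sorted(years):
        if year + 1 not in years:
            continue  # need both the year and the following year reported
        m0, c0 = years[year]
        m1, c1 = years[year + 1]
        if m0 < 20:
            continue  # the at-least-20-murders floor from the text
        if m1 < m0:          # murders declined year over year...
            total += 1
            rose += c1 > c0  # ...did the clearance rate rise?

print(f"{rose} of {total} murder-decline city-years saw clearance rates rise")
```

The same loop over the real data produces the roughly 60 percent figure described above.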
The overall relationship isn’t hugely strong for each individual agency — there are plenty of examples of places with decreasing murder and decreasing clearance rates, like New York above. So while a decline in murders doesn’t inherently mean a higher clearance rate, it makes sense that rising or falling crime nationally would be a big factor in the overall national clearance rate trend. But other factors, such as an agency’s dedication of resources, can undoubtedly play a role as well.
It’s not particularly surprising in this light that clearance rates rose for nearly every crime type in 2023. The only exception was motor vehicle theft, which is also the only crime type that saw an increase in reported offenses nationally.
Clearance rates are the formal official statistic used to measure a police department’s effectiveness in solving cases, but they’re not great analytically because the formula is set up in a way that can be misleading.
We built a dashboard for the New Orleans City Council measuring the arrest rate of murders and shootings in New Orleans which I think does a better job of measuring the department’s effectiveness. The arrest rate takes the number of offenses with an arrest that took place in a given year divided by the number of offenses in that year.
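The difference between the two measures can be sketched on made-up case records, treating a clearance as an arrest for simplicity (each case pairs an offense year with the year of an arrest, if any):

```python
# Each case: (offense_year, arrest_year or None). All figures made up.
cases = [
    (2023, 2023), (2023, 2024), (2023, None),
    (2024, 2024), (2024, None), (2024, None),
]

year = 2024
offenses = sum(1 for oy, _ in cases if oy == year)

# Clearance rate: arrests made in `year`, whenever the offense occurred,
# over offenses reported in `year`.
clearance_rate = sum(1 for _, ay in cases if ay == year) / offenses

# Arrest rate: share of `year`'s own offenses that have led to an arrest.
arrest_rate = sum(1 for oy, ay in cases if oy == year and ay) / offenses

print(f"clearance rate {clearance_rate:.0%}, arrest rate {arrest_rate:.0%}")
```

On these toy numbers the clearance rate comes out to 67 percent while the arrest rate is 33 percent: the clearance rate gets credit for last year’s case solved this year, while the arrest rate tracks only this year’s offenses.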
It’s not perfect because it won’t count that murder-suicide and it doesn’t give credit for work done this year to solve last year’s case, but it does a much better job as a proxy of effectiveness over time. And the news for New Orleans is very positive as more than 53 percent of fatal shooting incidents have an arrest this year, the highest since we started tracking the data in 2011 (though obviously there’s lots of room for improvement!).
Clearance rates are a statistic that can be useful but also can be manipulated — intentionally or accidentally. An apparent increase in effectiveness can just as easily be explained as a change in workload. That doesn’t mean clearance rates are useless, but conclusions about what a change in clearance rate means should be approached critically given the inherent flaws in the underlying data.