5 Comments

If a law enforcement agency takes one thin dime from the Federal Government, it should be required to report crime data. That would still let an agency decline Federal money if it considers the reporting too onerous…

A suggestion: Jeff-alytics has highlighted the homicide declines to date in 2024 in various American cities. A helpful analytic tool would be an index for each city in each year based on its value from 1960. 1960 was the baseline for many cities before they began a multi-decade crime wave, so yearly changes would then be read relative to that baseline value.
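
Concretely, the index I have in mind (my own notation) is index(city, year) = 100 × homicide rate(city, year) ÷ homicide rate(city, 1960), so an index of 100 means a city is back at its 1960 rate and 200 means double it.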

For example, one of the largest percentage declines among American cities with many homicides is Philadelphia's: its homicide rate/count is down ~40% YTD. Yet it is still double its baseline rate from 1960, so Philly would be given an index value of 200. New York City, by contrast, currently has an index value of ~80, so its yearly change could be understood in terms of this index. The perspective becomes even more meaningful for a city such as LA, which is now near its baseline value (i.e., an index of about 100).

Thinking in terms of these baselines and the current index gives a much better idea of where the equilibrium value might be. Ignoring this perspective makes cities such as LA, with low index values, seem to be doing worse than cities with high crime rates that are merely returning to their baselines. A population-weighted average could also be constructed as a measure of how much of the crime wave that ran from 1960 to ~1995 (probably caused by lead) has been pushed back.
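
A rough sketch of the arithmetic, with made-up numbers standing in for real rates and populations (the placeholders are chosen so the indices come out to roughly the 200, 80, and 100 mentioned above; they are not actual figures):

```python
# Sketch of the proposed 1960-baseline index plus a population-weighted summary.
# The rates and populations below are illustrative placeholders, not real data;
# actual values would come from historical UCR / city homicide figures.

# homicide rate per 100k residents (1960 baseline vs. current) -- placeholders only
cities = {
    "Philadelphia": {"rate_1960": 10.0, "rate_now": 20.0, "pop": 1_600_000},
    "New York":     {"rate_1960": 5.0,  "rate_now": 4.0,  "pop": 8_300_000},
    "Los Angeles":  {"rate_1960": 7.0,  "rate_now": 7.0,  "pop": 3_900_000},
}

def baseline_index(rate_now: float, rate_1960: float) -> float:
    """100 means a city is back at its 1960 rate; 200 means double that rate."""
    return 100.0 * rate_now / rate_1960

indices = {name: baseline_index(c["rate_now"], c["rate_1960"]) for name, c in cities.items()}

# Population-weighted average index: one number for how far the post-1960
# crime wave has been pushed back across the included cities.
total_pop = sum(c["pop"] for c in cities.values())
weighted = sum(indices[name] * c["pop"] for name, c in cities.items()) / total_pop

for name, idx in indices.items():
    print(f"{name}: index {idx:.0f}")
print(f"Population-weighted index: {weighted:.0f}")
```

With these placeholder inputs it prints an index of 200 for Philadelphia, 80 for New York, 100 for LA, and a population-weighted index near 100.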

Yours is one of my favorite Substacks to read.

Excellent analysis. Clear and data-driven. Thank you.

Thoughtfully put and detailed. Thank you!
