5 Comments

Hi Jeff: The overall issue is not more or less crime reporting via BJS, it's crime reporting itself. That the overwhelming majority of violent and property crime goes unreported is substantiated by BJS, Statistics Canada, the Office of Juvenile Justice and Delinquency Prevention, and BJS's report on identity theft. I'm fairly sure that if I went to the UK crime survey (or other nations' crime surveys) it would say the same thing.

My point is not simply to remind people that reported crime is not a measure of overall (or real) crime. If the overwhelming percentage of crime is not reported (BJS: identity theft, 7 percent reported; OJJDP: juvenile crime, 25 percent reported), then we should not discuss "crime" in absolute terms when it's more than possible that either increases or decreases (homicides excepted) are profoundly misleading. We could see a 5 percent increase or decrease in reported FBI data when the opposite is true.
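To make that arithmetic concrete, here is a minimal sketch with hypothetical numbers of my own (nothing here comes from BJS or the FBI) showing how a modest slip in the reporting rate can make reported counts fall even while actual crime rises:

```python
# A minimal sketch (hypothetical numbers, not real data) of how a change in the
# reporting rate can make reported crime move in the opposite direction from
# actual crime.

def reported(actual_crimes: int, reporting_rate: float) -> int:
    """Number of crimes that show up in reported (e.g., FBI) data."""
    return round(actual_crimes * reporting_rate)

# Year 1: 1,000,000 actual crimes, 45% reported.
# Year 2: actual crime rises 10%, but the reporting rate slips to 40%.
year1 = reported(1_000_000, 0.45)   # 450,000 reported
year2 = reported(1_100_000, 0.40)   # 440,000 reported

change = (year2 - year1) / year1
print(f"Reported crime changed by {change:.1%}")  # about -2.2%, despite a 10% rise in actual crime
```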

All of this was discussed over 50 years ago when the USDOJ instituted the National Crime Victimization Survey. Yet it's interesting that the latest NCVS shows huge increases in violent crime across two reports (overall crime and juvenile crime), and that report is ignored by 99 percent of the people discussing crime.

I fully support your work and the efforts of others to create a real-time overview of reported crime data. It's a solid step in the right direction.

But by anyone's objective analysis, we are not discussing crime; we are discussing reported crime, and only God knows where that takes us. Ethically, we need disclaimers stating that reported crime "could" be wildly inaccurate, and we should be offering links to the NCVS so accurate comparisons can be made.

All of this is being discussed in a political light, which is unfortunate. The data is what the data is. After the huge increase in violent crime in the NCVS, it's inevitable that violent crime will be significantly lower in the next report, so partisans will always be disappointed by data that doesn't swing their way.

The same regression effect is currently playing out with homicides after a 30-50 percent increase, depending on which data you use.

So for those of us who review the work of others, including the media, claims of huge reductions in crime require some context, because the claims (one way or another) could be significantly off.

Beyond disclaimers and context, I do not have a quick answer as to the proper and ethical way to proceed. Maybe reported crime should be described as a possible trend based on reported numbers, but it needs to be offered as "reported" data, not actual crime numbers. One possible solution would be using NCVS data at the local level, which BJS is currently exploring. Maybe we use AI to weight a variety of variables (e.g., polling data) to create a crime severity index like Statistics Canada's; a rough sketch of that kind of index follows below.
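Here is a rough sketch of what a severity-weighted index looks like mechanically, in the spirit of Statistics Canada's approach. The weights, counts, and jurisdiction below are purely illustrative, not real figures; Statistics Canada derives its actual weights from court sentencing data.

```python
# A rough sketch, in the spirit of Statistics Canada's Crime Severity Index,
# of weighting offenses by seriousness instead of counting them all equally.
# Weights and counts are purely illustrative.

ILLUSTRATIVE_WEIGHTS = {
    "homicide": 7000.0,
    "robbery": 583.0,
    "motor_vehicle_theft": 84.0,
    "theft_under": 37.0,
}

def severity_index(counts: dict[str, int], population: int,
                   weights: dict[str, float] = ILLUSTRATIVE_WEIGHTS) -> float:
    """Severity-weighted offenses per 100,000 population."""
    weighted_total = sum(weights[offense] * n for offense, n in counts.items())
    return weighted_total / population * 100_000

# Hypothetical jurisdiction: theft falls but homicide rises, so the raw count
# of offenses drops while the severity-weighted index goes up.
year1 = {"homicide": 50, "robbery": 900, "motor_vehicle_theft": 4_000, "theft_under": 30_000}
year2 = {"homicide": 65, "robbery": 950, "motor_vehicle_theft": 4_500, "theft_under": 26_000}

print(severity_index(year1, 1_000_000))  # ~232
print(severity_index(year2, 1_000_000))  # ~235
```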

If this were medical research, we would never rely solely on reported numbers. We would not be making medical decisions based solely on reported numbers without endless context that the reported numbers could be wrong (or significantly wrong). Yet we do it for crime policy.

Again, I believe all of this was discussed over 50 years ago with the beginning of the National Crime Victimization Survey, and I'm barely scratching the surface of the problems with reported crime data.

Yet I'm under no illusion that anything will change, because people want simplicity. If a reporter asks me whether crime is up or down, they want a five-second answer, not an overview of the possibilities. A common reporter question: "What the hell is the National Crime Victimization Survey?"

Thanks for "listening." Len.


"I couldn’t find a margin of error for 1994, but the percentage reporting was lower then than it was in 2022 (which is remarkable considering how high the violent crime rate was nationally 30 years ago!)."

While it might seem unexpected at first glance that reporting would be lower during a crime spike, I don't think it should be. During spikes, resources don't usually increase commensurately with the increase in crime. Moreover, people can see that public safety capacity is stretched as indicators like 911 response times and case clearance rates worsen. That might understandably cause some to decide that reporting a crime is not worth the effort.


Hi Jeff - Mark Pope with RTI International here. There is a dashboard on BJS’ website that we helped develop that should give you the margin of error for all years. See https://ncvs.bjs.ojp.gov/quick-graphics


Thanks Mark, this is great. Much obliged for passing it along!


This argument makes a lot less sense when broken down by type of crime. For example, we know that auto thefts are reported far more often than petty theft, because people need to report auto theft for insurance reasons. Looking at NIBRS, auto thefts are exploding while straight theft is down. Why would petty theft be declining while auto theft is exploding? The obvious explanation is that people no longer bother to report petty theft but still need to report auto theft.

I also don’t know how NCVS handles thefts from institutions - is a CVS employee going to say they were “victimized” by a shoplifter? Probably not. So if thefts from institutions and corporations are up (which is consistent with the claims and observed behavior of retailers), we don’t necessarily expect to see that in the NCVS.
