Do Crime Victims Say They Are Reporting Less Often To The Police?
It depends on your timeframe.
Whether crime is actually going down or whether changes in reported crime statistics simply reflect lower reporting rates is a critical challenge when evaluating crime trends. I’ve written before about how underreporting can obfuscate crime stats locally and explored whether it’s occurring nationally. We know that underreporting isn't driving the large decline in murder that we've seen in 2023 and 2024, but what about other crimes that are less likely to be reported?
Another way to approach the issue is to look at the National Crime Victimization Survey (NCVS). The biggest contribution that the NCVS makes to the public discourse (IMO) is that it quantifies how frequently people do not report major crimes to the police.
Table 4 of the 2022 NCVS shows the share of violent crime victims (including simple assault victims, for the sake of consistency over time) who said they reported the crime to the police. That share fell from 46 percent in 2021 to 42 percent in 2022, while property crime reporting rose slightly from 31 to 32 percent.
This seems to strongly suggest that a smaller share of violent crime victims are reporting crimes to the police, but the case is not quite as open and shut as that table makes it seem, for a few reasons.
The NCVS is a survey, and 2021 was an especially difficult year to conduct one with COVID and whatnot. The 2021 survey was made even more difficult because its questioning stretched back into the second half of the preceding year.
I wrote about that feature here, but to summarize: respondents are asked whether they’ve been the victim of a crime in the six months before the survey is taken, with responses coded to the year of the survey rather than the year of the offense. So 2020’s survey actually covered a roughly even mix of non-COVID and COVID months, from July 2019 through November 2020, while 2021’s survey captured the bulk of COVID, from July 2020 through November 2021.
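To make that reference-period quirk concrete, here is a minimal Python sketch of the coverage windows. The assumption that interviews happen in every month of the survey year, each asking about the prior six months, is my own simplification of the description above rather than anything from BJS documentation.

```python
# A minimal sketch of the NCVS reference-period quirk described above.
# Assumption (mine, for illustration): interviews occur in every month of the
# survey year, and each interview asks about the six months *before* the
# interview month, with incidents coded to the survey year.

def months_back(year: int, month: int, k: int) -> tuple[int, int]:
    """Return the (year, month) that is k months before the given month."""
    idx = year * 12 + (month - 1) - k
    return idx // 12, idx % 12 + 1

def coverage_window(survey_year: int) -> tuple[tuple[int, int], tuple[int, int]]:
    """Earliest and latest offense months that can be coded to a survey year."""
    earliest = months_back(survey_year, 1, 6)   # a January interview reaches back to July of the prior year
    latest = months_back(survey_year, 12, 1)    # a December interview covers through November
    return earliest, latest

for yr in (2020, 2021, 2022):
    (ey, em), (ly, lm) = coverage_window(yr)
    print(f"NCVS {yr} can include offenses from {ey}-{em:02d} through {ly}-{lm:02d}")
```

Running it reproduces the July-through-November windows described above: the 2020 survey can include offenses from July 2019 through November 2020, and the 2021 survey from July 2020 through November 2021.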
Looking more closely at the last few years shows that 2021’s 46 percent violent crime reporting rate was high relative to recent years. The 2022 reporting rate for violent crime (41.5 percent) was actually higher than what showed up in the 2019 and 2020 NCVS (40.9 and 40.2 percent, respectively).
The 2022 figure isn’t the outlier; 2021 is. I know very little about conducting polling, and certainly have no unique insight into the methodological challenges of 2021, but it would make intuitive sense for 2020’s and 2021’s surveys to be a bit wonky in various ways.
Another reason to be cautious about reading too much into the NCVS figures is that, as a survey, it carries a margin of error. The margin of error for violent crime reporting has typically sat between 2 and 3 percentage points for most of the last 15 years, meaning we can say with 95 percent confidence that 2022’s 41.5 percent figure falls somewhere between roughly 39.5 and 43.5 percent.
Applying the margin of error to the share of violent crime victims who told the survey they reported to the police shows that 2022’s upper bound falls within the margin of error of all but three of the last 11 years (2014, 2015, and 2021).
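For anyone who wants to see that comparison spelled out, here is a minimal sketch using the reporting rates cited above. The flat 2-point margin of error is a placeholder I'm assuming for illustration; the real year-by-year standard errors come from BJS's published tables and vary by year.

```python
# A minimal sketch of the margin-of-error comparison described above.
# The reporting rates are the NCVS figures cited in this post; the flat
# +/- 2.0-point margin of error is an assumed placeholder, not BJS's
# published standard error for any particular year.

MOE = 2.0  # assumed 95 percent margin of error, in percentage points

reporting_rate = {2019: 40.9, 2020: 40.2, 2021: 46.0, 2022: 41.5}

low_2022 = reporting_rate[2022] - MOE
high_2022 = reporting_rate[2022] + MOE
print(f"2022 interval: {low_2022:.1f} to {high_2022:.1f} percent")

for year in (2019, 2020, 2021):
    low, high = reporting_rate[year] - MOE, reporting_rate[year] + MOE
    overlaps = low <= high_2022 and high >= low_2022
    verdict = "overlaps 2022's interval" if overlaps else "does not overlap 2022's interval"
    print(f"{year}: {low:.1f} to {high:.1f} percent -> {verdict}")
```

With these placeholder inputs, 2021 is the only year shown whose interval fails to overlap 2022’s, which lines up with the point made above that 2021 is the odd year out.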
I went back to 2001 and pulled margins of error (they’re hard to find before that) for every year but 2006¹. The graph below shows the violent crime reporting percentage along with the margin of error for all of those years.
And here is just the percent of people saying they reported their violent crime to the police for every year back to 1993.
Putting the above table together with the two graphs makes three important points about violent crime reporting patterns in the United States.
First, there’s no evidence that people are systematically reporting crime less often nationally now than they were in recent years. Of course, it’s possible that a shift is happening outside our ability to measure it, and we only have data through 2022, but all the NCVS tells us is that reporting in 2022 was roughly in line with most of the last decade. So our most recent national crime trends probably aren't all that affected by changing reporting patterns.
Second, the NCVS does suggest that people reported violent crimes less often over the last five years than they did 15 or 20 years ago. It’s not a huge difference, maybe 5 to 10 percentage points, but the survey suggests there’s been a decline, which is undoubtedly worth studying and understanding.
Finally, violent crime reporting trends aren’t dramatically different from what they were in the 1990s. I couldn’t find a margin of error for 1994, but the percentage reporting was lower then than it was in 2022 (which is remarkable considering how high the violent crime rate was nationally 30 years ago!).
As to whether violent crime is being reported less often now than in recent years, the evidence we do have points to ‘no’. But, as with most crime data, the data is imperfect enough that the question cannot be answered with total certainty.
¹ Per BJS, “Due to methodological changes in the 2006 NCVS, use caution when comparing 2006 criminal victimization estimates to other years.”
Hi Jeff: The overall issue is not more or less crime reporting via BJS, it's crime reporting itself. That the overwhelming majority of violent and property crime goes unreported is substantiated by BJS, Statistics Canada, the Office of Juvenile Justice and Delinquency Prevention, and BJS's report on identity theft. I'm fairly sure that if I went to the UK crime survey (or other nations' crime surveys) it would say the same thing.
My issue is not just reminding people that reported crime is not a measure of overall (or real) crime. If the overwhelming percentage of crime is not reported (per BJS, only 7 percent of identity theft is reported; per OJJDP, about 25 percent of juvenile crime is reported), then we should not discuss "crime" in absolute terms when it's more than possible that either increases or decreases (except homicides) can be profoundly misleading. We could see a 5 percent increase or decrease in reported FBI data when the opposite could be true.
All of this was discussed over 50 years ago when the USDOJ instituted the National Crime Victimization Survey. It's interesting that the latest NCVS shows huge increases in violent crime across two reports (overall crime and juvenile crime), yet that report is ignored by 99 percent of people when discussing crime.
I fully support your work and the efforts of others to create a real-time overview of reported crime data. It's a solid step in the right direction.
But by anyone's objective analysis, we are not discussing crime; we are discussing reported crime, and only God knows where that takes us. Ethically, we need disclaimers stating that reported crime "could" be wildly inaccurate, and we should be offering links to the NCVS so accurate comparisons can be made.
All of this is being discussed in a political light, which is unfortunate. The data is what the data is. After the huge increase in violent crime in the NCVS, it's inevitable that violent crime will be significantly lower in the next report, so partisans will always be disappointed by data that doesn't swing their way.
The same regression effect is currently playing out with homicides after a 30-50 percent increase, depending on what data you use.
So for those of us who review the work of others, including the media, claims of huge reductions in crime require some context, because those claims (one way or another) could be significantly off.
Beyond disclaimers and context, I do not have a quick answer as to the proper and ethical way to proceed. Maybe reported crime should be described as a possible trend based on reported numbers, but it needs to be offered as "reported" data, not actual crime numbers. One possible solution would be using NCVS data at the local level, which BJS is currently exploring. Maybe we use AI to weigh a variety of variables (e.g., polling data) to create a crime severity index like Statistics Canada's.
If this were medical research, we would never rely solely on reported numbers. We would not be making medical decisions based solely on reported numbers without endless context that those numbers could be wrong (or significantly wrong). Yet we do it for crime policy.
Again, I believe that all of this was discussed over 50 years ago with the beginning of the National Crime Victimization Survey and I'm barely scratching the surface of the problems with reported crime data.
Yet I'm under no illusion that anything will change, because people want simplicity. If a reporter asks me whether crime is up or down, they want a five-second answer, not an overview of the possibilities. A common reporter question: "What the hell is the National Crime Victimization Survey?"
Thanks for "listening." Len.
"I couldn’t find a margin of error for 1994, but the percentage reporting was lower then than it was in 2022 (which is remarkable considering how high the violent crime rate was nationally 30 years ago!)."
While it might seem unexpected at first glance that reporting would be lower during a crime spike, I don't think it should be. During spikes, resources don't usually increase commensurately with the increase in crime. Moreover, people can see that public safety capacity is stretched as indicators like 911 response times and case clearance rates worsen. That might understandably cause some to decide that reporting a crime is not worth the effort.