Fatality rates for shootings in general have actually been *increasing* over time: https://jamanetwork.com/journals/jamanetworkopen/article-abstract/2788467. The likely reason is semi-automatic and larger-caliber firearms becoming more common in shootings.
Thanks for sending that along, I hadn't seen it. It's not necessarily what shows up elsewhere in PD shooting data, though few departments have pre-2010 numbers; here's NYC:
| Year | % Fatal |
| ---- | ------- |
| 2006 | 22% |
| 2007 | 20% |
| 2008 | 18% |
| 2009 | 19% |
| 2010 | 21% |
| 2011 | 19% |
| 2012 | 17% |
| 2013 | 17% |
| 2014 | 17% |
| 2015 | 20% |
| 2016 | 18% |
| 2017 | 18% |
| 2018 | 21% |
| 2019 | 19% |
| 2020 | 19% |
| 2021 | 21% |
| 2022 | 20% |
Wait, are semi-automatic firearms becoming more common (in a way that makes this the most salient feature)? I am definitely willing to believe that semiautomatic *rifles* are becoming more common (though I would want to confirm that claim), but my understanding is that most pistols in use for about the last century have been either semiautomatics or double action revolvers, and that these aren't too different.
As I do some quick research to make sure I'm not talking out of my ass, I find the following, and sort of answer my own question. DA revolvers have the same immediately-ready-to-refire property that mostly defines semiautomatics. The main differences have to do with reload speed, ammunition capacity, and accuracy rather than the firing speed that is usually emphasized in discussions of semiautomatics. Self-loading semiautomatic pistols have slowly taken over from revolvers over the course of the 20th century, and the properties I just mentioned would make sense as drivers of fatal shooting rates, though again it's not strictly a matter of firing speed. These terms aren't super well-defined, and I think that causes problems for a lot of conversation on firearm regulation. See also bump stocks.
The evidence I am familiar with just looks at gunshot victims/crimes and breaks down recovered guns by caliber/firearm type. I cannot say whether these mirror the general-population breakdown of firearms (they will be much more slanted towards handguns, obviously). ("most ... last century" seems clearly wrong to me, I presume that is maybe a misspeak though -- bought this current century?)
The causal mechanism is that larger bullets and more bullets cause more physical damage and increase the probability of death. So I agree it is about how many bullets, not the firearm type per se (with the bump stock being a good example).
See https://www.annualreviews.org/doi/abs/10.1146/annurev-criminol-061020-021528 for citations of multiple studies. Here is one quote:
> Empirical evidence suggests that guns are becoming more lethal over time with larger shares featuring higher-capacity magazines and using larger-caliber bullets (D’Alessio 1999, Kellermann et al. 1991, McGonigal et al. 1993, Webster et al. 1992, Wintemute 1996). For instance, in Boston, higher-capacity semiautomatic pistols capable of shooting more bullets replaced revolvers as the most frequently recovered type of handgun beginning in the 1990s (Braga 2017). As a consequence, the share of smaller-caliber handguns among crime gun recoveries decreased over the 2000s as the prominence of larger-caliber handguns increased.
> ("most ... last century" seems clearly wrong to me, I presume that is maybe a misspeak though -- bought this current century?)
It's the usual thing where "the last century" kind of means "the 20th century" in my brain, no matter how deep into the 21st we get. Your cite putting the transition in the 1990s accords with what I was trying to convey. I wouldn't really be surprised to hear it started in the 1980s, though there obviously could be a bit of a time lag between people adopting semiautomatics over revolvers and those guns being recovered in crime scenes.
Double action revolvers would have been in use since at least the World War era, I think, and they are equivalent to semiautomatics in terms of rate of fire but noticeably less accurate than single action revolvers or semiautomatic pistols, and obviously don't have a semiauto's ammunition capacity.
All that said: I know just enough about firearms to know I'm a bit out over my skis talking about what was popular when, so take all this with a grain of salt.
Rambling a bit...
The rule of thumb that a change in counts of independent events from N1 to N2 is probably more than random noise if |N1-N2| > sqrt(N1 + N2) is still pretty useful, and I think it's hard to get much more information by checking if shootings and fatal shootings moved in the same direction. (In the case of crimes, of course, you should also adjust N1 and N2 for population change, because more people is a real but boring reason for more total crimes.) We know the rule is imperfect beyond just being only true on average -- it's not 50 independent events if fifty migrants die of heat stroke in a semi trailer in San Antonio or a very well armed nutter cuts loose at a concert. The reason the deaths happened that year instead of a year later or earlier could be all but random from a policy standpoint.
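For concreteness, here is a minimal sketch of that rule of thumb in Python (the counts and populations below are invented for illustration, and the population adjustment is just one reasonable way to do the correction mentioned above):

```python
import math

def beyond_noise(n1: int, n2: int, pop1: float = 1.0, pop2: float = 1.0) -> bool:
    """Rule of thumb: a change in counts of independent events is probably more
    than random noise if |N1 - N2| > sqrt(N1 + N2). Optionally rescale year 1's
    count to year 2's population so growth alone doesn't trip the rule."""
    n1_adj = n1 * (pop2 / pop1)  # year 1 count restated at year 2's population
    return abs(n1_adj - n2) > math.sqrt(n1_adj + n2)

# Hypothetical city: 55 murders one year, 70 the next, population flat.
print(beyond_noise(55, 70))                    # |55-70|=15 > sqrt(125)≈11.2 -> True
# Same counts, but the population also grew 10%.
print(beyond_noise(55, 70, 100_000, 110_000))  # |60.5-70|=9.5 < sqrt(130.5)≈11.4 -> False
```

At national scale the same arithmetic is why small percentage moves clear the bar: with counts in the tens of thousands, sqrt(N1 + N2) is only around one percent of the total.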
In contrast to murder fluctuations in cities that might have 20 murders a year, it's a big country, and any change in murder rates nationally over ~2% is not "just random." Something happened; now you can argue over what that was and whether it represents anything lasting and worth worrying about.
The distinction between "sustained trend" and "sustained difference" is also important. The murder rate increase since 2014 has been a sustained difference that didn't, and I believe still won't, go away in just a few years. There is much less basis nationally to claim a sustained upward trend.
I don't know whether major harm to a city's reputation typically produces a sustained crime trend or only a sustained crime difference. There are chronic negatives, such as the many-year accumulation of effects from businesses and residents becoming less willing to locate there, and acute negatives like riots. A worse city reputation supports a sustained increase in crime rates, but the support for a sustained upward trend is mixed.
I would take it a step further. In gang shootings, there will be fire until something stops the fire:
1. One side is too injured (or dead) to continue.
2. Police or some other intervening party shows up and the firing stops.
Example from Chicago (although my politics are radically different from heyjackass.com, I quite like that website): 2,893 total shot this year as of this morning, versus 139,783 shots detected by microphone this year. I.e., shooting continues until its purpose is achieved or irreparably frustrated (for the time being).
The Poissonian statistics to which you refer (hence the square root) may apply to gun-involved aggravated assault, (failed) attempted murder, and (successful) murder, but probably not meaningfully to individual shots, at least as they show up in the detected counts.
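A rough simulation of why the square-root rule breaks down once events arrive in bursts (every number here is invented, not Chicago data): give each independent incident a cluster of shots and the year-to-year spread of the shot total comes out much wider than sqrt(total).

```python
import numpy as np

rng = np.random.default_rng(0)
incidents_per_year = 500    # hypothetical independent shooting incidents per year
shots_per_incident = 10     # hypothetical average burst size per incident
years = 10_000

# Incidents are roughly Poisson; each incident contributes a burst of shots.
incidents = rng.poisson(incidents_per_year, size=years)
shots = np.array([rng.poisson(shots_per_incident, size=n).sum() for n in incidents])

print("mean shots per year:", shots.mean())
print("observed std dev:   ", shots.std())
print("sqrt(mean), i.e. pure Poisson noise:", np.sqrt(shots.mean()))
# The observed spread is roughly 3x the Poisson prediction: total shots are
# overdispersed, so the |N1-N2| > sqrt(N1+N2) rule only makes sense at the
# level of incidents (or victims), not individual rounds fired.
```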
Murder in the cities you track is decreasing. Is that decrease mirrored in rural areas?
https://www.ncja.org/crimeandjusticenews/gun-violence-rates-in-rural-areas-match-or-outpace-cities#:~:text=Rural%20communities%20are%20experiencing%20high,age%2Dadjusted%20homicides%20per%20100%2C000.
Without looking too closely, I suspect that article is badly spun. Low-population counties, typically rural, can end up with very high or very low murder rates just due to random noise, so you expect to see low population counties near the top and the bottom of any list of murder rates by county. Anyone who would fail to point that out is selling something.
Rates allow jurisdictions of different populations to be compared equally.
Dividing by population makes comparisons less unfair. It does not make them fair. Suppose you have 5 counties of 5,000 people each, and one of them had a murder last year. That adds up to 4 counties with a murder rate of 0, and one with a murder rate of 20 per 100,000, which is way more dangerous than NYC. But it may all be a mirage. None of those counties is a murder-free paradise or more dangerous than NYC. It's just the kind of statistical noise you get when counting just a handful of events, which does not happen when you are counting hundreds.
That is why you expect the most and least dangerous counties to both be low-population areas just based on statistical noise, even if there is no actual trend... except there is a trend. Murder rates are higher on average in urban areas. (For suicide I believe the opposite is true.) The Center for American Progress is engaging in spin.
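A quick simulation of that mirage (the county sizes and the "true" rate here are made up): give every county exactly the same underlying murder rate, and the small ones still dominate both ends of the observed-rate ranking.

```python
import numpy as np

rng = np.random.default_rng(1)
true_rate = 5 / 100_000   # identical underlying murder rate everywhere (hypothetical)

# Hypothetical county mix: many small counties, a few big ones.
pops = np.concatenate([np.full(2_000, 5_000), np.full(50, 1_000_000)])
murders = rng.poisson(true_rate * pops)
observed = murders / pops * 100_000   # observed rate per 100k

order = np.argsort(observed)
print("10 'safest' counties' populations:   ", pops[order[:10]])
print("10 'deadliest' counties' populations:", pops[order[-10:]])
# Both extremes are 5,000-person counties: one murder either way swings their
# observed rate between 0 and 20+ per 100k, while the million-person counties
# all sit close to the true rate of 5.
```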
Another statistical detail that needs compensating for is that the average percent increase in murder rates from year to year across subdivisions tends to be a positive number even when murder rates don't change at all, and the smaller the subdivisions, the larger the effect. If County A goes from 1 murder to 2 and County B goes from 2 murders to 1, the average change is (+100% - 50%)/ 2 = +25% even though the murder rate has not changed at all. There are a lot of cheap tricks biased people on the left or right can use to manipulate crime figures, and the best solution I can suggest is to get your figures from someone like Jeff who knows what they're doing and isn't trying to play games.
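A minimal sketch of that averaging artifact (counts are simulated, nothing real): hold the true rate fixed, draw two years of murder counts for a large number of small counties, and the average of the per-county percent changes still comes out clearly positive.

```python
import numpy as np

rng = np.random.default_rng(2)
expected_murders = 2.0    # same small expected count both years, in every county
counties = 100_000

year1 = rng.poisson(expected_murders, counties)
year2 = rng.poisson(expected_murders, counties)

ok = year1 > 0            # percent change is undefined where year 1 had zero murders
pct_change = (year2[ok] - year1[ok]) / year1[ok] * 100

print("average per-county % change:", round(pct_change.mean(), 1))
# Roughly +15% with these settings, even though nothing actually changed:
# a county going 1 -> 2 contributes +100% while one going 2 -> 1 only
# contributes -50%, so the average gets pulled upward.
```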
To expand a bit on that last point: once you get down to very low base rates, a lot of reporting shenanigans related to timing and recency start to come into play. If one of the five suspicious deaths in the area in the last three years happened in March, and you presume it was a suicide for a while, that might help the sheriff keep his "kept the murder rate down" ads going through the election. If there are five suspicious deaths every *month*, that strategy probably isn't available.
To hazard a guess that Jeff is welcome to correct -- I think we have a very poor idea what the answer to your question is. If we had good murder totals nationally, it typically wouldn't take nine months to figure out how many people got shot last year. Instead, there are some places where somebody, either the cops or a newspaper, keeps up-to-date homicide totals and where there are enough total homicides to make it worth the effort for Jeff to track, and those places are rarely rural.
This is a little off-topic, but it's a question about crime (and especially murder) stats that I've had for a while that I don't know if you've answered.
It is generally known that crime increases with population density even when scaled for population, for all sorts of reasons known and unknown. We can come up with explanations for this, but it's hard to shake the basic trend, and that makes it difficult to compare across regions with varying population density. But since most crimes that we're interested in are interactions between two people (perpetrator and victim), I wonder: what happens when you scale by population squared? Does that control for population in a way that regular per-capita scaling doesn't? I'm not sure if you've ever covered this idea or things related to it, but I'd be curious to know what's been done on this issue.
I think to the extent there's any relationship between population density and crime, population squared would grossly overcount it. NYC is easily our densest city and its murder rate is moderate. LA is the other largest city, and second densest, and its murder rate is also not too bad. Plenty of rural areas have elevated murder rates. A prediction system that gets NYC, Singapore and Tokyo totally wrong would be a hard sell.
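To make the two scalings concrete (all of these numbers are invented for illustration): divide a big dense city's and a small town's murder counts first by population and then by the number of possible victim-offender pairs, which is the population-squared idea.

```python
# Hypothetical places: (name, population, murders in a year) -- invented numbers.
places = [
    ("Big dense city", 8_000_000, 300),
    ("Mid-size city",    500_000,  40),
    ("Small town",         5_000,   1),
]

for name, pop, murders in places:
    per_capita = murders / pop * 100_000          # usual rate per 100k residents
    per_pair = murders / (pop * (pop - 1) / 2)    # rate per potential victim-offender pair
    print(f"{name:15s}  per 100k: {per_capita:6.2f}   per pair: {per_pair:.2e}")

# Linear scaling already puts the small town's single murder at the top of the
# rate table; per-pair (population-squared) scaling pushes the big city's rate
# down by another factor of its population, which is the sense in which the
# quadratic denominator would swamp any real density effect.
```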
My understanding is that the typical linear population scaling already overcounts murder rates. I'm open to being proven wrong, but that idea is why I was thinking about quadratic scaling in the first place - by increasing the denominator, you account better for the fact that increasing population density gives you both more potential murderers and more potential murder victims at the same time.
(Also I believe the densest municipality in the US happens to be Union City, NJ, not to be confused with the Township of Union, NJ, seat of Union County, NJ. Union City is in Hudson County, across the river from NYC proper, and therefore arguably part of the NYC megalopolis despite the state line, but my ex worked there, brought that up all the time, and would be annoyed if I didn't bring it up.)
Yes, I should have said "large city," though I certainly wasn't aware that Union City was tops. As for population density, I think most people just want to know how likely they are to be killed, and are not reassured by just how many people they may pass in a day who don't kill them.
To be clear, I meant the Union City stuff above to be tongue-in-cheek. Tone on the internet, etc etc.
https://www.cdc.gov/mmwr/volumes/72/wr/mm7213a2.htm