Are high levels of existing COVID-19 population immunity in US counties associated with a lower infection rate in this current wave?
This thread contains my latest findings on this question.
Initial conclusion: No, there is practically no correlation.
The plot above shows the percentage of the population infected before Sep 1 & after Sep 1 in each county (based on covid19-projections.com estimates).
The question is: can knowing the % infected before Sep 1 in a county predict the relative severity of this current wave?
When looking at all 3,000+ counties, the answer is no. There is practically no correlation (R^2 = 0.002) between the % infected before Sep 1 and after Sep 1.
So given a county, the COVID-19 prevalence before Sep 1 has no predictive value in determining the severity since Sep 1.
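As a minimal sketch of this check (the per-county data here is synthetic stand-in data, not the actual covid19-projections.com estimates; with independent draws, R^2 comes out near zero just as in the real comparison):

```python
import numpy as np

def r_squared(x, y):
    """R^2 of a simple linear fit of y on x (squared Pearson correlation)."""
    r = np.corrcoef(x, y)[0, 1]
    return r ** 2

# Hypothetical stand-in for the ~3,000 per-county estimates:
# pct_before = share infected before Sep 1, pct_after = share infected after.
rng = np.random.default_rng(0)
pct_before = rng.uniform(0.02, 0.25, size=3000)
pct_after = rng.uniform(0.01, 0.20, size=3000)  # independent of pct_before

print(round(r_squared(pct_before, pct_after), 3))  # near 0: no predictive value
```

An R^2 this close to zero means a county's pre-Sep-1 prevalence explains essentially none of the county-to-county variation in the current wave.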
A lot of people have asked why the CDC estimates close to 100M total US COVID-19 infections (28%) by Dec 1, while covid19-projections.com only estimates 58M (17%).
I believe there are major flaws in the CDC estimates, which I will explain in this thread.
To begin, the covid19-projections.com model is tuned on serology surveys, while the CDC model is not.
While the CDC estimates 7x more COVID-19 infections than reported, covid19-projections.com estimates the ratio is currently ~3x, down from ~10x in April and ~4x over the summer.
Applying the CDC claim that "1 in 7 total infections were reported" to North & South Dakota — where roughly 10% of residents had tested positive — implies that 70% of their populations were infected, which doesn't pass a common-sense test.
While a 7x multiplier is believable for the spring, the paper claims it still held in September.
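The Dakotas arithmetic can be checked directly. The ~10% reported-case share is my approximate back-of-envelope figure, not a number from the thread:

```python
# Back-of-envelope check of the "1 in 7 infections reported" implication.
# The ~10% cumulative reported-case share for the Dakotas is an approximate
# assumption on my part, not a figure stated in the thread.
reported_share = 0.10        # cumulative reported cases / population
cdc_multiplier = 7           # CDC: 1 in 7 total infections were reported
c19pro_multiplier = 3        # covid19-projections.com: ~3x currently

implied_cdc = reported_share * cdc_multiplier
implied_c19pro = reported_share * c19pro_multiplier

print(f"CDC-implied infected share: {implied_cdc:.0%}")        # 70%
print(f"covid19-projections-implied share: {implied_c19pro:.0%}")  # 30%
```

The 70% figure is what fails the common-sense test; the ~3x multiplier implies a far more plausible ~30%.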
Many people are unaware that the COVID-19 vaccine has significantly more side effects than the flu vaccine. I hope to see more honest discussions regarding this.
Props to @Cat_Ho for her realistic, data-centric reporting of this issue. It's much needed.