The Democrats are just being Democrats and demanding that everything be shut down.
The Republicans, on the other hand, are playing this “Democrats are the real X” game, and telling the pollsters that they think Democrats are doing bad things.
With the Biden administration’s efforts to vaccinate Americans hitting a brick wall and the delta variant wreaking havoc nationwide, lawmakers in recent weeks have increasingly pointed fingers at social media companies for allowing COVID-19 misinformation to spread on their platforms.
New Morning Consult polling suggests that much of the public agrees that social media plays a role in halting progress toward a true post-pandemic existence — and nearly two-thirds back legislation that would punish the platforms that enable such misinformation to proliferate.
Sixty-three percent of adults said they’d support a federal bill holding internet platforms responsible if content generated by their users and other third parties spreads misinformation about COVID-19 vaccines and public health emergencies, including 78 percent of Democrats, 57 percent of independents and 52 percent of Republicans.
By contrast, just over 1 in 5 adults — including 12 percent of Democrats, 23 percent of independents and 32 percent of Republicans — said they were against holding platforms accountable for the spread of misinformation amid public health crises. The share of those who didn’t know or had no opinion ranged from 10 percent (Democrats) to 21 percent (independents).
The poll of 2,201 U.S. adults was conducted July 23-25, just days after Sen. Amy Klobuchar (D-Minn.) introduced legislation that would revoke Section 230 liability protections for internet companies that fail to crack down on the dissemination of misinformation during public health emergencies, though the survey did not reference Klobuchar or Section 230 specifically. The poll has a margin of error of 2 percentage points.
Last week, Amy Klobuchar introduced a bill that would strip Facebook of its immunity from lawsuits under Section 230 of the Communications Decency Act if it algorithmically promotes health misinformation during a health crisis. https://t.co/NzMyusEtUm
— Brookings Institution (@BrookingsInst) July 28, 2021
When respondents were asked whom they think is most responsible for controlling the spread of false information about the coronavirus on platforms, 27 percent pointed fingers at social media companies — a 4-point rise from a March 2020 survey. Thirty-four percent of adults blamed the users who originally posted false information, virtually unchanged from the poll taken 16 months ago.
On vaccine misinformation specifically, the problem appears to be centered on a small group of prolific posters: A recent Center for Countering Digital Hate report found that 12 individuals — dubbed the “Disinformation Dozen” — were responsible for 65 percent of anti-vaccine content on social media platforms, including up to 73 percent of such content on Facebook Inc.’s platforms.
Thirty-five percent of adults said they think social media companies are doing a “poor” job when it comes to preventing the spread of false information about coronavirus vaccines on their platforms, while 27 percent said the performance of companies is just “fair.”
We’ll just have to wait and see how demonizing the left as “anti-vaxx” works for the right wing.

The pressure on social media companies from Capitol Hill to do a better job of policing anti-vaccine content is likely to ramp up as the Biden administration considers mandates and other avenues to meet its vaccination goals.
And as Facebook and other social media companies point to “authoritative” vaccine information as a means to drown out false content, lawmakers can note one last data point in arguing their case for stricter moderation: Just 10 percent of Americans believe that false or misleading information about the coronavirus that is posted online should not be removed.
The “Dems are the real racists” project worked out so well, after all — let’s just hope this one goes the same way.
Having now read Klobuchar's new bill re: Section 230, I have no clue what the point is. It targets content that is 1A protected anyway. And if liability fears do reach site operators, the only safe solution is to just ban all discussion of the health emergency.
— Bob, agent of Hydra (@dman4835) July 24, 2021
This is especially true when you create liability around a category of speech as broad as "public health info," when we know guidance can change quickly. If this law was on the books a year ago could FB be sued for amplifying a NYT article about how you don't need to wear a mask?
— Evan Greer (@evan_greer) July 22, 2021