Opinion: One way to stop the dangerous spread of vaccine myths

Editor’s Note: Kara Alaimo, associate professor of public relations at Hofstra University, is the author of “Pitch, Tweet, or Engage on the Street: How to Practice Global Public Relations and Strategic Communication.” She was spokesperson for international affairs at the Treasury Department during the Obama administration. Follow her on Twitter @karaalaimo. The opinions expressed in this commentary are solely those of the author. View more opinion at CNN.



(CNN)

White House Director of Communications Kate Bedingfield said Tuesday that the Biden administration is examining whether social media companies should be held legally accountable for spreading misinformation on their platforms by amending Section 230 of the Communications Decency Act, which shields them from liability for content posted by their users. The statement comes at a time when disinformation – particularly about Covid-19 vaccines – is sparking both public outcry and the belief among many that social media companies should be more aggressive in fighting back.


White House officials have expressed outrage over the failure of Facebook and other platforms to crack down on false vaccine claims amid the growing number of coronavirus cases, hospitalizations and deaths in recent days. Last week, President Joe Biden asserted that social media platforms are “killing people” by allowing health misinformation to proliferate, though on Tuesday he walked back those words and instead blamed the purveyors of that misinformation.

Facebook, for its part, disputes claims that it is responsible for fueling disinformation. A spokesperson told CNN that Biden’s claims that tech companies are responsible for spreading misinformation about vaccines “are not supported by the facts. The fact is that more than 2 billion people have viewed authoritative information about COVID-19 and vaccines on Facebook, which is more than any other place on the internet.”

As the coronavirus continues to sicken Americans and so many others around the world, people (especially those fortunate enough to live in countries with reliable access to Covid-19 vaccines) should be encouraged to get shots that can not only protect them against serious illness and death, but also prevent them from spreading the virus to those around them who aren’t eligible for vaccination – such as children and people with certain medical conditions. The spread of vaccine misinformation must be stopped.

This is why the White House is right to take aim at Section 230. It needs to be updated. But the exceptions to this law must be extremely narrow, focused on widely spread disinformation that clearly threatens lives.

According to the Center for Countering Digital Hate, only 12 people are responsible for 65% of the vaccine misinformation circulating online. The organization found 812,000 cases of anti-vaccine content on Facebook and Twitter between February 1 and March 16, 2021, which it said was just a “sample” of the misinformation that is spreading widely.

Tech companies’ failure to stop it is unacceptable. Worse, according to the Center’s report, such disinformation has at times actually been recommended to users by Instagram (which is owned by Facebook). And even when this bogus content was reported to the social media companies, they overwhelmingly failed to take action against it. While the Center faults Facebook, Twitter and Google alike for failing to identify and remove anti-vaccine content, it notes that “the magnitude of the disinformation on Facebook, and therefore the impact of their failure, is greater.”

Many internet activists oppose changing Section 230, since removing its protections against legal liability for online intermediaries that host or repost content could limit our ability to have large-scale conversations on social networks, including conversations on controversial topics. And, clearly, it wouldn’t be possible for tech companies to monitor and verify every conversation we have on social media each day. “Attacking Section 230 of the CDA only shows that you have no idea what you are talking about when it comes to ending online abuse,” wrote Zoë Quinn – who was falsely accused online of sleeping with a reviewer to get a glowing review of a game she created, and who was inundated with death threats and other abuse as a result – in the book “Crash Override: How Gamergate (Nearly) Destroyed My Life, and How We Can Win the Fight Against Online Hate.”

But there is a way to protect the openness of the internet and the ability of social media to function while cracking down on lies that cause mass damage. Congress should pass a law holding tech companies responsible for removing content that directly endangers lives and achieves mass reach – say, more than 10,000 likes, comments or shares. The definition of endangering lives should also be narrow. It should cover serious threats to public health – like misinformation about vaccines – and other direct incitements to serious harm to ourselves or others.

Such a requirement in updated legislation would allow tech companies to focus their policing efforts on content that spreads widely (and, incidentally, also makes them the most money, since social networks rely on popular content to keep people on their sites so they can earn ad revenue). The content with the most reach and engagement is, of course, the most influential and therefore the most potentially dangerous.

There is, of course, ample legal precedent for this. As I have pointed out before, it is constitutional to restrict free speech in limited cases – for example, when it threatens to facilitate crime or poses imminent, genuine threats. Surely, misinformation that fuels a deadly pandemic qualifies.

Such a law would also carry a serious danger, one that cannot be ignored and must be addressed: politicians could try to use it to block the dissemination of information they simply dislike – information that is in fact true. (Remember how former President Trump routinely branded accurate reports he disliked as “fake news”?) That is why the arbiters of truth in such cases should be federal judges, who are appointed by the president but confirmed by the Senate and expected to be impartial. The Department of Justice and state attorneys general could sue social media companies for failing to remove deadly disinformation spreading widely on their platforms; such cases could be decided by panels of judges (the better to protect against a single partisan jurist); and tech companies found to have broken the law would face financial penalties.

The real idea here is that the prospect of financial penalties – and the public relations damage that accompanies lawsuits – would prompt social networks to step up their policing of disinformation so they avoid being sued in the first place. This would leave the responsibility for finding and shutting down dangerous, widely spread falsehoods primarily with the companies themselves.

This is, of course, exactly what happens with copyrighted material. Copyright infringement is not protected by Section 230, so when a user shares copyrighted material on a social network without permission, the copyright owner can sue the platform for damages. This is why social networks have become so adept at removing such content – and how we ended up in situations like the one in which Twitter took down a clip featuring the band Nickelback that former President Trump had posted, while separately allowing him to invoke the prospect of civil war.

If tech companies can figure out how to remove clips that harm people’s business interests, surely they can figure out how to remove posts that threaten our lives too.

Social media companies could have avoided this kind of regulation by doing a better job of tackling disinformation in the first place. But they have long tried to shirk responsibility for the social effects of the disinformation that spreads on their platforms. In 2017, Facebook CEO Mark Zuckerberg used his voting power to block a shareholder resolution that would have required the company simply to report publicly on how it handles disinformation and what effects its disinformation policies have.

Like the viruses that vaccines protect us against, misinformation on social media has become explosively contagious – and deadly. Congress should inoculate us against some of the worst of it while preserving the viability of broad, unfettered speech on social media that doesn’t threaten lives.


