Facebook’s design makes it easier to spread anti-vaccine messages, study finds

A woman at an anti-vaccine protest in Barcelona in 2021. Carles Ribas (EL PAÍS)

Messages such as “vaccines kill more people than covid” or “vaccines contain baby tissue” are among the false claims that Facebook users shared starting in March 2020. In an attempt to combat misinformation about vaccines, Facebook removed these posts, but it could not suppress the interest that this type of content generated on its social network. Anti-vaccine users took advantage of Facebook’s architecture to carve out new paths to vaccine misinformation: their content became “more misinformative, more politically polarized, and more likely to appear in users’ news feeds,” according to the study published this Friday in the journal Science Advances.

The study’s researchers used CrowdTangle, a tool owned by Facebook’s parent company Meta, to download public data from the platform. They analyzed 200,000 posts from Facebook pages and groups, all created before the platform removed “Stop Mandatory Vaccination,” one of the largest anti-vaccine pages on the social network, in November 2020. On December 3, 2020, Mark Zuckerberg’s company announced that it would remove false claims about covid vaccines, and the accounts that posted them, from the platform. Meta went on to remove more than 27 million pieces of content, according to Bloomberg. The company has since stopped removing false claims about covid.

The largest social networking platform in the world, with 2.073 billion active users according to Statista, is characterized by great flexibility in adapting content to users’ needs. Its “layered hierarchy” design, composed of pages (at the top), groups (in the middle) and individual users (at the bottom), offers alternative paths to anti-vaccine content and makes it easier to access, according to the study. “If the content of a page is deleted, the layered hierarchy allows users and groups to find similar (or even the same) content published on another page,” George Washington University professor David Broniatowski, an author of the study, explained to EL PAÍS. In other words, the administrators of these pages can share content with links to other pages.
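
A minimal sketch of that dynamic, using a hypothetical data model (the class names and structure below are illustrative, not Meta’s actual code): pages publish content, groups link to pages, and taking down one page leaves the same content reachable through a mirror page that the group also links to.

```python
# Hypothetical model of the "layered hierarchy": pages at the top,
# groups in the middle. Removing one page does not remove the same
# content re-shared via another page that a group links to.

class Page:
    def __init__(self, name):
        self.name = name
        self.posts = []        # content published by the page's admins
        self.removed = False   # set to True when moderation takes it down

class Group:
    def __init__(self, name, linked_pages):
        self.name = name
        self.linked_pages = linked_pages  # pages the group's admins link to

    def visible_posts(self):
        # Content stays reachable through any linked page that is still up.
        return [post
                for page in self.linked_pages if not page.removed
                for post in page.posts]

# Two pages carry the same (misinformative) post; one group links to both.
a = Page("anti-vax page A")
b = Page("mirror page B")
a.posts.append("vaccine misinformation")
b.posts.append("vaccine misinformation")
group = Group("discussion group", [a, b])

a.removed = True               # moderation removes page A...
print(group.visible_posts())   # ...but the post survives via page B
```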

And it is there, on Facebook pages and groups, where the political information consumed by its fundamentally conservative audience is shared. As Spanish researcher Sandra González-Bailón, of the University of Pennsylvania, explained to EL PAÍS in August: “In terms of volume, there are many more users than pages or groups. But if we remove pages and groups, there is less segregation. There’s something intuitive about it, because you don’t choose your family members, but you do choose which pages and groups you follow. There is more self-selection. But it is a decision of the platform’s design and how it is governed. We see that pages and groups create more division instead of helping to build bridges.”

Beyond its design, likes can also promote anti-vaccine content. The more likes or angry reactions a Facebook post receives, the more likely it is to appear in other users’ news feeds, because of its “meaningful social interaction,” according to the researchers. If anti-vaccine content creators manipulate these reactions, they increase the exposure of their content online. Asked about the study, Professor David García, of the University of Konstanz (Germany), points out that “reducing the value of the angry emoji, in cases of medical misinformation, is a good idea, because we know that expressions of moral indignation increase polarization.” Although Facebook reduced the weight of angry reactions to zero in September 2020, the study reports that interaction levels with false content did not decrease compared to interactions prior to the removal policy.
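
As a toy illustration of how reaction weights feed into such a ranking signal (the weights and scoring function below are assumed for the example; the real News Feed ranking is far more complex and not public), setting the angry reaction’s weight to zero sharply lowers the score of an outrage-heavy post:

```python
# Illustrative engagement score: a weighted sum of reaction counts.
# Higher-scoring posts are assumed to surface more often in feeds.

REACTION_WEIGHTS = {"like": 1.0, "love": 1.0, "angry": 1.0}  # assumed values

def engagement_score(reactions, weights):
    """Weighted sum of reaction counts for a single post."""
    return sum(weights[r] * count for r, count in reactions.items())

post = {"like": 120, "love": 15, "angry": 300}  # outrage-heavy post

print(engagement_score(post, REACTION_WEIGHTS))                    # 435.0
print(engagement_score(post, {**REACTION_WEIGHTS, "angry": 0.0}))  # 135.0
```

The sketch also shows why the change can fall short: creators who farm likes instead of angry reactions keep the score high regardless of the angry weight.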

Rafael Rubio, an expert in disinformation at the Complutense University of Madrid, explains that Facebook’s algorithm favors disinformation because “it reduces the plurality of the information received and multiplies exposure to similar information.” To address this, he suggests changes to the platform’s rules: “The complaints procedure can be improved, increasing its transparency and reducing the reach of news dissemination, which directly affects the algorithm.”
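
The narrowing Rubio describes can be sketched in a few lines (illustrative only, not Facebook’s actual algorithm): a feed that ranks candidates by similarity to what a user already engaged with pushes near-duplicates of past interests to the top.

```python
# Toy similarity-based feed ranking: the more a candidate post overlaps
# with a user's history, the higher it ranks, reducing plurality.

def similarity(a, b):
    """Jaccard similarity between two sets of topic tags."""
    return len(a & b) / len(a | b)

user_history = {"vaccines", "conspiracy"}  # topics the user engaged with

candidates = {
    "post about vaccine conspiracies": {"vaccines", "conspiracy"},
    "fact-check of vaccine claims":    {"vaccines", "health"},
    "unrelated local news":            {"city", "traffic"},
}

feed = sorted(candidates,
              key=lambda p: similarity(candidates[p], user_history),
              reverse=True)
print(feed)  # the near-duplicate of past interests ranks first
```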

The study and the experts agree that the social network must review its policy against misinformation to avoid risks, and they propose alliances between the large platforms to combat the problem. Broniatowski suggests that social media designers collaborate to develop a “building code” that promotes the health of everyone on the network, a metaphor he draws from home construction: “Building architects have to balance the design objectives of a property with compliance with regulations that protect the people who use it,” he concludes.

