Can Facebook’s self-regulation ensure information hygiene?

We have spent a decade discussing the problems posed by social media platforms, examining issues such as filter bubbles, echo chambers, information silos, virality, misogynist trolling and the amplification of toxic content. The platform companies have now started acknowledging their role in vitiating the information universe and undermining the lives of millions.

Last week, Facebook apologised for its role in the 2018 anti-Muslim riots in Sri Lanka. An in-house probe found that incendiary content on the social media platform may have led to the deadly violence. The social media conglomerate has also launched a content oversight board, which will be able to overturn decisions by the company and Chief Executive Mark Zuckerberg on whether individual pieces of content should be allowed on Facebook and Instagram. Google has launched its Google News Initiative to help journalism thrive in the digital age.

Will these initiatives and measures by the platform companies be enough to sanitise our news and information environment? I can reasonably claim that I am not only familiar with the work of two members of the 20-member Facebook oversight board — Alan Rusbridger, former editor of The Guardian, and Nighat Dad, Executive Director of Pakistan’s Digital Rights Foundation — but also respect their contribution to a free and fair information space. Mr. Rusbridger has written an explanatory note on why he agreed to join the team. He sees the social media giant at the intersection of three humongous crises confronting the credible information space: information chaos, a crisis in free expression and a crisis in journalism. Though he is guarded about the final outcome, Mr. Rusbridger said: “There is, in my view, no excuse for not trying. The balancing of free expression with the need for a better-organized public square is one of the most urgent causes I can imagine.”

I am a votary of self-regulation, but I am sceptical of the mechanism created by Facebook. First, let us look at some proven instances of its gross failures: the anti-Muslim riots in Sri Lanka, the violence against the Rohingya in Myanmar, the sordid details of the harvesting of personal data by Cambridge Analytica, and the mindless censoring of the Pulitzer Prize-winning photograph of the “Napalm girl”, which spoke of the barbarity of war. People from other regions have their own lists of the platform’s failures.

Dealing with localised propaganda

Second, there is a huge gap between the magnitude of Facebook’s size, reach, linguistic diversity and influence on the one hand, and the oversight board’s limited capacity to look into localised propaganda on the other. As of the first quarter of 2020, Facebook had 2.6 billion monthly active users, and the number surpasses 3 billion a month if one includes the company’s other popular products, such as WhatsApp, Instagram and Messenger. It is available in 101 languages.

For instance, how will the oversight board handle a toxic posting in Tamil? I have noticed a growth in bigotry, hate and abuse in Tamil-language postings. These do not reveal their true colours when subjected to mechanical translation without adequate context. What are the mechanisms to capture the gravity of the polarising content that fills cyberspace? Problematic posts should be removed on a war footing because there is a political dividend in inciting hate.

We must remember that the grim images broadcast by the killer at two mosques in Christchurch, New Zealand could not be controlled or moderated by the social media giants, despite their conscious and collective decision to take down the offensive video. There seems to be no mechanism to prevent the reposting of offensive content that has been taken down.

Rebecca MacKinnon, a fellow with the University of California National Center for Free Speech and Civic Engagement and co-founder of Global Voices, is convinced that the oversight board “cannot stop the exploitative collection and sharing of user data, or stop the company from deploying opaque algorithms that prioritize inflammatory content to maximize engagement.” I tend to agree with her. In print and broadcast organisations, there is a shared value system among all the stakeholders, which makes self-regulation an effective mechanism. But there is no unifying objective shared by Mark Zuckerberg, his hand-picked corporate board, Facebook’s nearly 2.5 billion users and the oversight board, despite the individual credentials of its members.

Website: The Hindu

