iNDICA NEWS BUREAU-
For the hundredth time, Facebook has come under fire for engaging in shady strategies to keep users addicted to its platform, after a whistleblower told US lawmakers that the social media giant fuels division, harms children, and urgently needs to be regulated, drawing pledges that Congress would take up long-delayed action.
On Tuesday, October 5, Mark Zuckerberg hit back, saying the claim that the company puts profits over safety is “just not true.”
The testimony by ex-employee Frances Haugen has fueled one of Facebook’s most serious crises yet. Haugen testified on Capitol Hill after she leaked reams of internal research to authorities and The Wall Street Journal.
The documents submitted by Haugen reference research and findings by Facebook’s own researchers. Among them are some related to India, describing what the researchers said was fear-mongering and dehumanizing content promoted by Facebook accounts believed to be either run by or associated with the Rashtriya Swayamsevak Sangh (RSS), the ideological fountainhead of the ruling Bharatiya Janata Party (BJP).
“RSS (Indian nationalist organization Rashtriya Swayamsevak Sangh) Users, Groups, and Pages promote fear-mongering, anti-Muslim narratives targeted pro-Hindu populations with V&I (violence and inciting) intent…,” says the complaint filed with the US Securities and Exchange Commission (SEC).
The BJP and the RSS did not respond to queries seeking comment.
An internal report titled “Adversarial Harmful Networks – India Case Study”, cited in one of the documents, quotes its author as saying: “…and we have yet to put forth a nomination for designation of this group given political sensitivities”.
The document also shows that India is categorized as “Tier 0”, alongside only the US and Brazil, in what the company calls its “Top 3 Political Priorities”.
Only 0.2% of the reported hate speech is taken down by automated checks, according to the documents cited, which also flag the lack of classifiers able to handle content in local languages.
The complaint says Facebook’s internal records show how the lack of Hindi and Bengali classifiers meant much of the reported content, particularly anti-Muslim narratives, was never flagged or dealt with by its systems.
Facebook has over 340 million users in India, a significant share of its 2.89 billion monthly active users worldwide.
Classifiers are automated systems and algorithms designed to detect hate speech in content posted on Facebook.
Facebook has faced allegations of inaction against content posted by certain groups in India. The Wall Street Journal (WSJ) last year cited an internal Facebook report that termed the Hindu nationalist group Bajrang Dal a “dangerous” organization, but said the company did not act on the report because of financial and safety concerns.
The controversy then grew to spotlight the role of Facebook’s India policy team and its then head, Ankhi Das. Facebook and Das maintained at the time that there was no wrongdoing. Das subsequently left the company.