By Our Representative
In a US Congressional briefing, two Facebook employees-turned-whistleblowers Frances Haugen and Sophie Zhang have slammed a human rights report from Meta, the company that owns Facebook, for failing to acknowledge its role in spreading disinformation and hate speech in India, especially from those belonging to India’s ruling Hindu nationalist Bharatiya Janata Party (BJP).
That Meta’s first-ever global Human Rights Impact Assessment (HRIA) report, released in July, failed to address its complicity in the spread of disinformation in India underscores that the social media giant prioritized profit over combating hate, Haugen and Zhang said at the Congressional briefing, organised by several US-based civil rights groups, many of them representing the Indian diaspora.
The briefing was co-hosted by Genocide Watch, World Without Genocide, Indian American Muslim Council, Hindus for Human Rights, International Christian Concern, Jubilee Campaign, 21Wilberforce, Dalit Solidarity Forum, New York State Council of Churches, Federation of Indian American Christian Organizations of North America, India Civil Watch International, Center for Pluralism, International Commission for Dalit Rights, American Muslim Institution, Students Against Hindutva Ideology, International Society for Peace and Justice, Humanism Project and Association of Indian Muslims of America.
Haugen, who became a global celebrity last year after sharing tens of thousands of incriminating documents with the US Securities and Exchange Commission, dismissed Meta’s claim of protecting human rights and providing remedies for negative impacts.
“Facebook's report points [out that] they have an oversight board that people can appeal to, that they're transparent about what they take down,” Haugen said during the virtual briefing. “But the reality is that they won't give us even very basic data on what content moderation systems exist in which languages and the performance of those systems.”
Facebook had “under-invested” in high-quality content moderation systems and “rolled out the bare minimum” for them, she said. “They won't even let us see samples of how these systems perform [as] activists are having their content taken down.”
The only country you can actually read about in Meta’s report is the US, which, she said, has the “safest, most sanitized version of Facebook… the cleanest corner of Facebook.” On the other hand, non-English languages “get less investment [and] quality assessment.”
Facebook’s own regular checks of the top ten posts in countries facing conflict found that “post after post would be horrific [with] gory images, severed heads… We would sit there and discuss, like, how did this get through? Why was this getting the most distribution?” Haugen said of her time as a product manager with Facebook’s civic integrity department.
“Facebook's products are designed to give the most reach to the most extreme content,” and its algorithms were “intrinsically majoritarian” as the content that gets a better reaction from the majority gets “more distribution… Human rights and Facebook are intertwined. We can't advance human rights, [and] we can't have safe discourse unless Facebook actively participates and has a relationship with the public,” Haugen claimed.
Fired from Facebook in 2020 as a data scientist after she exposed its failure to combat allegedly fake and abusive content, Zhang said Facebook faced “the most political interference in India” and was “most deferential” to the Indian government because of India’s “increased willingness” to threaten action and “the lack of public reaction” in support of a “tougher line” in India.
“Facebook has effectively conducted a massive donation in kind to authoritarian governments by refusing to act and allowing their bad behavior to continue,” Zhang said, adding that it was “biased towards those in power.” The people “who can regulate Facebook and force you to change the situation have no incentive to change [it.] The only people who want to change the situation are those not in power who cannot change it.”
Facebook’s failure to control hate speech and disinformation in India could have serious consequences. “If Facebook leads to the degradation of democracy in India, that will hurt its relationships with the United States and American interests globally,” she said.
Zhang said Meta refused to close fake accounts in India that she uncovered because they were linked to a BJP member of Parliament. “As soon as the discovery was made, I could not get an answer from anyone. It was as if they had stonewalled me,” she said. “Facebook did not want to say yes because they were afraid of any important parliamentary figure.”
Facebook cared “not about saving the world and protecting democracy. It cares about its profit. [It] has a strong incentive to be solicitous and deferential towards the ruling party.”