Nairobi has become the battleground for two Ethiopians seeking more than Sh250 billion from Facebook parent company Meta in compensation for victims of hate and violence allegedly fueled by the social media giant.
The precedent-setting case filed in the High Court accuses Meta Platforms Inc of fueling violence in East Africa by failing to moderate inciteful messages posted on Facebook.
Mr Abrham Meareg says his father, Prof Meareg Amare Abrha, was killed in November last year amid the war between the Ethiopian government and the Tigray People’s Liberation Front (TPLF).
He says a post on Facebook profiled his father and accused him of associating with the TPLF.
Another petitioner, Fisseha Tekle, a legal advisor at Amnesty International, says human rights groups cannot protect people’s rights when the social media platforms people rely on for news and connection fuel hate and disinformation.
Through lawyer Mercy Mutemi, the two want the High Court to compel Meta to create a restitution fund of Sh200 billion for victims of hate and violence perpetrated by Facebook.
“Upon the finding of liability as prayed in above, an order establishing the Facebook Advertisements Victims Fund in Kenya to be administered by this honourable court or its nominee and to which the respondent shall be required to deposit Sh50 billion for the benefit of any Facebook user in Kenya who has been shown a boosted or sponsored post containing content that constitutes inciteful, hateful and dangerous speech,” the petition states.
This is the second case Meta is facing in Kenya, after a South African man, who was employed as a content moderator, accused the giant social media company of exploitation and poor working conditions at its Nairobi office.
The case is pending a ruling on whether the High Court has jurisdiction to hear the matter.
The latest case accuses Meta of failing to take action against inciteful, hateful and dangerous content in order to protect its business interests.
“Put otherwise, the respondent benefits from the prioritisation of hateful, inciteful and dangerous content on its platform,” the petition reads.
Ms Mutemi says Facebook has become one of the channels through which terrorist propaganda is shared, and that although the company is technologically able to adjust and restrict Facebook’s viral algorithm, it has chosen to ignore such content in East and Southern Africa.
She said the company has previously deployed such measures to protect Facebook users in the US, but failed to invoke them to protect communities elsewhere.
She says that on January 6, 2021, when an attack on the US Capitol disrupted a joint session of Congress, Meta deployed the ‘Break the Glass’ (BTG) procedure, which muted hateful and dangerous content.
BTG is a series of specific algorithmic changes deployed in crisis situations, under which dangerous content is quickly removed and prevented from further distribution.
The petition says many people have suffered human rights violations because Meta refused to take down Facebook posts that violated the Bill of Rights, even after reports were made to the company.