Meta’s main subcontractor for content moderation in Africa, Sama, will retain its BCorp certification until the case against it in Kenya, over claims of union busting and exploitation, is decided. The case, which also includes allegations against Meta, was filed in May this year by Daniel Motaung, a former content moderator in the East African country.
The corporate responsibility group B Lab told TechCrunch that the decision to uphold Sama’s certification was made after its standards management team concluded an initial review of allegations against the company, as captured in a Time magazine article, and of similar complaints received through its complaints process.
The BCorp status is a stamp of approval for companies judged to meet high standards of transparency, performance and accountability, taking into account several factors including employee welfare, company structure and work processes. The status is arguably one of the bases for Sama’s claim that it is an ethical AI company.
“In cases where legal or regulatory action is possible, B Lab does not pursue independent concurrent investigations. Our complaints process recognizes the rigor of the legal processes and relies on the outcomes of those judgments,” B Lab said.
“After the outcome of the lawsuit is available, further action against Sama may also be taken in the form of a formal investigation with a decision on eligibility by B Lab’s Standards Advisory Council. This may also require an onsite visit by B Lab to Sama offices in East Africa and interviews with content moderation employees,” it said.
B Lab said it is holding off on certifying new companies that employ content moderators, adding that it will introduce new risk standards that such companies will have to meet to be considered for the status.
The new standards will cover transparency, especially during recruitment, access to wellness programs, and company accountability for monitoring employees’ health.
Case files say Sama carried out a “deceptive recruitment process” by advertising vacancies that failed to disclose the nature of the work successful applicants would do at its hub in Nairobi. The moderators are sourced from a number of countries, including Ethiopia, Uganda and Somalia.
According to court records, Motaung, who was laid off for organizing a 2019 strike and trying to unionize the subcontractor’s employees, said his job exposed him to graphic content that has had a lasting effect on his mental health.
The moderators sift through social media posts across Meta’s platforms, including Facebook, to remove content that perpetrates and perpetuates hate, misinformation and violence.
Motaung is seeking financial compensation for himself and other former and current moderators, and also wants Sama and Meta compelled to stop union busting and to provide mental health support, among other demands.
Meta wants the case dropped, noting that moderators had signed a non-disclosure agreement barring them from presenting evidence against it.