
How Facebook Overlooks Egypt’s Female Genital Mutilation Epidemic

March 17, 2023
Photo credit: REUTERS

Growing up in Egypt, I knew female genital mutilation (FGM) as a well-known, much-feared and widely discussed reality for Egyptian and African women. The term conjures up the many public service announcements warning against the dangerous practice. It brings to mind vivid, quite graphic images of operations gone wrong, and the many headlines about parents and practitioners, legal or illegal, sentenced over the death of yet another young girl undergoing FGM.

To most of us women living in various African countries, the threat of FGM is real; it is common and quite literally — as well as metaphorically — scarring.

But something so obviously wrong, and so effortlessly recognizable as a crime against women and children, was apparently not so obvious to Meta's content reviewers and artificial intelligence (AI) systems, which failed to spot and categorize it as a violation of Facebook's community standards.

When I tried to report comments supporting FGM and promoting the illegal clinics that perform it (content that violates two of the listed community standards, namely inciting violence and endangering children), Facebook did not agree. And this is when the penny dropped: how can a platform so popular be so oblivious to a practice performed in a community that represents more than 9 percent of its users?

More than 200 million girls and women around the world are affected by FGM; and this isn’t simply a matter of an old practice dwindling down. Today, around 55 million girls in Africa under the age of 15 have undergone FGM or are at risk.

It is practiced in around 29 countries, despite being criminalized in 22 of them, and in some it is far more widespread than in others. In Somalia, the prevalence is a staggering 98 percent. In sheer numbers, owing to its large population, Egypt leads: this very real threat has affected more than 27 million women.

The numbers are there. They are clear as day. The threats and lifelong side effects the practice has on women are well known: from chronic pain to depression and lifelong trauma that permanently affects women's sex lives and sexual drive, all the way to death from complications. The risks are exacerbated by the fact that in most countries the procedure is conducted illegally, with zero professional or governmental control or supervision, by unlicensed practitioners in shoddy underground clinics.

But to the creators of Facebook's community standards and its reviewers, evidently, none of that seems clear or reason enough to worry.

I came across a random, and quite obviously spam, comment supporting and promoting FGM on an unrelated Facebook post. I went to the page that had posted the comment, and it was more of the same: post after post promoting the supposed merits of a practice that has been proven over and over to be unnecessary, unfounded, and deeply harmful to women. The posts even promoted illegal clinics where parents can take their young girls, against the girls' will, to undergo the procedure at the hands of unlicensed practitioners or licensed medical professionals gone rogue.

Meta clearly states that they “aim to prevent potential offline harm that may be related to content on Facebook.” Security is one of the four pillars of their community standards, and they claim they will “remove content, disable accounts and work with law enforcement when we believe there is a genuine risk of physical harm or direct threats to public safety.”

So, like any good Facebook citizen, I earnestly reported the post, and the page, as contradicting community standards. Surely a post promoting a practice considered a form of gender-based violence and a violation of human rights by international organizations like UNICEF and UN Women would be taken down immediately; surely it counts as a threat to security, children's security no less.

Surely a page inciting violence against women and young girls would be suspended, or at least warned. Wrong. I received a message from Facebook stating that the page does not contradict its community standards. So I appealed the decision, hoping for a more thorough review of the page's posts. The appeal was also rejected.

That post, and the response I received from Facebook, sent me down the rabbit hole, digging for similar content on Facebook and Twitter, the two most popular social media platforms in the region where FGM is widespread.

What I found was horrifying. There were posts. There was a flood of false information. There were even hashtags. The basics of media literacy in the age of social media entail knowing that there is a dark side to the web and its social platforms. And the platforms do recognize that dark side; they seem to be doing a fabulous job banning posts that go against their political and economic interests.

In Africa, there are more than 250 million Facebook users, with some estimates putting the figure as high as 350 million. That is around 13 percent of all Facebook users worldwide. It seems irresponsible that Facebook does not educate itself or its reviewers on an issue that affects and threatens a community representing at least 9 percent of its users.

For the people at the back of the Facebook offices who did not hear: FGM can result in an array of complications, including hemorrhage and infection, which very often lead to death. It also has long-term consequences, including complications during childbirth, anemia, cysts, keloid scars, damage to the urethra resulting in urinary incontinence, painful intercourse and other sexual dysfunctions, and a sea of psychological effects including depression and suicidal thoughts. New studies have even linked FGM to higher maternal mortality rates.

FGM is a life risk for women. It is as simple as that. The United Nations Population Fund (UNFPA) estimates that between 2015 and 2030, 68 million girls will be cut.

The least Facebook could do, as one of the largest social media platforms in the world and consequently one of the most powerful and impactful media companies, is to be part of the solution: to make a conscious effort to teach its artificial intelligence and its reviewers to combat such violent, dangerous and widely accessible posts.

The opinions and ideas expressed in this article are the author’s and do not necessarily reflect the views of Egyptian Streets’ editorial team. To submit an opinion article, please email [email protected].
