Statistics show that somewhere in Kenya’s cyberspace at this very minute, a child is likely being groomed for sexual exploitation and abuse. The abuser, preying on her naivety, will flatter her with a torrent of compliments day after day and, finally, when he has primed her, ask for a nude photo.
Should she send the photo, it will mark the beginning of a painful chapter of coercion and blackmail for sexual purposes. A few years ago, it happened to over 21,700 children, according to the “Disrupting Harm” report of 2022.
The report detailed cases of possession, manufacture and distribution of child sexual abuse material in the country, including pornographic content meant for circulation.
“The most popular device to access the internet among 12-17-year-olds was the smartphone, with the most popular place to access the internet being cyber cafes,” notes the report.
Online child sexual exploitation is, however, not peculiar to Kenya. Policymakers worldwide are now pushing social media giants to close the gaps on their platforms that abusers exploit, in a bid to stem the now-rampant vice.
Slightly over a fortnight ago, the Senate Judiciary Committee in the US gave a group of prominent social media bigwigs a tongue-lashing, pressing them on shortcomings related to the safety of users on their platforms.
In a highly charged three-and-a-half-hour session, members of the powerful committee castigated the five tech leaders from Meta, TikTok, X (formerly Twitter), Snap and Discord, who run online services that are very popular with teenagers and younger children, for what they termed prioritising profits over the well-being of users.
Days later, Meta announced it was unveiling a tool that would allow users to take charge of their intimate imagery and prevent it from spreading.
The tool, dubbed ‘Take It Down’, uses hashing to create digital fingerprints of explicit images so they can be identified, removed from the web and prevented from being re-posted.
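In broad terms, hash-matching of this kind works by comparing the fingerprint of an uploaded file against a list of fingerprints of already-reported images. The sketch below is only illustrative, not Meta’s or NCMEC’s actual system: real deployments are reported to use perceptual hashes (such as PDQ) so that resized or re-encoded copies still match, whereas this example uses a plain SHA-256 digest, and the file names and hash values are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of fingerprints a platform might receive from a
# clearing house. A production system would hold perceptual hashes; a
# SHA-256 digest is used here only to keep the sketch self-contained.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest of the raw image bytes."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_block(image_path: Path) -> bool:
    """Flag an upload whose fingerprint matches a reported image."""
    return fingerprint(image_path) in KNOWN_HASHES

if __name__ == "__main__":
    upload = Path("incoming_upload.jpg")  # hypothetical uploaded file
    if upload.exists() and should_block(upload):
        print("Upload blocked: matches a reported image fingerprint.")
    else:
        print("No match found; upload proceeds to normal review.")
```

Because only the fingerprints are shared, participating platforms can screen uploads without the original images ever leaving the reporting person’s device.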
The Facebook parent firm said that the tool, which builds on the success of platforms that help adults stop their intimate images from being shared online, is a first-of-its-kind programme from the National Center for Missing and Exploited Children.
“The programme was first launched in English and Spanish but is now expanding to many more languages, making it accessible to millions of teens worldwide,” said Meta.
The firm has also partnered with technology company Thorn to update its Stop Sextortion hub, offering new tips and resources for teens, parents and teachers on how to prevent and handle sextortion.
On X (formerly Twitter), the child sexual exploitation policy states that any content found to depict or promote child sexual exploitation, including links to third-party sites where similar content can be accessed, is immediately pulled down without notice.
“X has zero tolerance towards any material that features or promotes child sexual exploitation, one of the most serious violations of the rules,” writes X.
Video-sharing platform TikTok says it does not accommodate accounts belonging to underage users, noting that its safety team has been trained to identify and deactivate them.
Locally, experts are calling for intensified awareness campaigns and parental guidance to shield children from online predators.
“The push spearheaded by the US will cause social media platforms to tighten up security and parental controls on the platforms. But besides that, parents and guardians should up their efforts in educating their children about the potential risks of using social media,” opines social media commentator Egline Samoei.
Her sentiments are shared by digital marketing strategist Nyandia Gachago, who calls for tightening rules in the local context and borrowing global best practices from leading economies such as the US.
“If the US tightens the rules for everyone, it’s like tightening a nut that catches online dangers worldwide. This will make our online spaces safer, but we also need our local rules to be in sync for better protection,” states Ms Gachago.
In Kenya, Section 37 of the Computer Misuse and Cybercrimes Act, 2018 criminalises the publication and dissemination of another person’s intimate images, prescribing a fine not exceeding Sh200,000 or imprisonment for a term not exceeding two years.