Twitter planned to build an OnlyFans clone, but CSAM issues reportedly derailed the plan

Twitter reportedly discussed creating an OnlyFans clone to monetize the adult content that has been rife on the platform for years, but its inability to effectively detect and remove harmful sexual content put the brakes on the idea, according to a report from The Verge. A team Twitter put together to see whether the company could pull off such a move found this spring that "Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale." The team's findings were "part of a discussion that ultimately led to us pausing the workstream for the right reasons," said Twitter spokeswoman Katie Rosborough.

Twitter reportedly halted the adult content monetization (ACM) project in May, not long after Elon Musk agreed to buy the company (that deal is now in limbo). The company's leadership team determined that it could not move forward with ACM without taking further health and safety measures.

The Verge's investigation details warnings from Twitter researchers in February 2021 that the company was not doing enough to detect and remove harmful sexual content such as child sexual abuse material (CSAM). Researchers reportedly told the company that Twitter's primary enforcement system, RedPanda, is "an outdated, unsupported tool" that is "by far one of the most vulnerable, inefficient, and unsupported tools" it employs.

While the company has machine learning systems in place, they seem to have trouble spotting new instances of CSAM in tweets and live streams. Twitter manually reports CSAM to the National Center for Missing and Exploited Children (NCMEC), but researchers found that the labor-intensive process resulted in a backlog of cases and delays in reporting CSAM to NCMEC. Rosborough told The Verge that since the researchers published their report last year, Twitter has significantly increased its investment in detecting CSAM and is hiring several specialists to address the problem.

“Twitter has zero tolerance for the sexual exploitation of children,” Rosborough said. “We aggressively combat child sexual abuse online and have made significant investments in technology and tools to enforce our policy. Our dedicated teams work to stay ahead of malicious actors and ensure we protect minors from harm – both online and offline.”

Advertisers may have balked at the idea of monetizing adult content (despite porn being rife on the platform), but the potential financial benefit for Twitter was clear. OnlyFans expects $2.5 billion in revenue this year, about half of what Twitter generated in 2021. Twitter already gives creators ways to reach the large audiences many of them have built on the platform, and adding OnlyFans-style features might have been a gold mine for adult content creators and the company alike. Despite the alleged improvements over the past 18 months, however, these major issues have kept the company from making the move.

Engadget has reached out to Twitter for comment.

