Twitter still isn’t doing enough to combat CSAM: report

Despite Elon Musk’s repeated claims that cracking down on child sexual abuse material (CSAM) on Twitter is “priority #1,” evidence continues to show that CSAM persists on Twitter.

According to a new report from The New York Times in conjunction with the Canadian Centre for Child Protection, not only was it easy to find CSAM on Twitter, but Twitter actually promotes some of the images through its recommendation algorithm.

The Times worked with the Canadian Centre for Child Protection to help match abusive images to the centre’s CSAM database. The publication uncovered content on Twitter that had previously been flagged as exploitative. It also found accounts offering to sell more CSAM.

During its search, the Times said it found images of ten child abuse victims in 150 instances “across multiple accounts.” The Canadian Centre for Child Protection, meanwhile, scanned Twitter for the most explicit videos in its database and found over 260 hits, with more than 174,000 likes and 63,000 retweets.

“The volume we’re able to find with a minimal amount of effort is quite significant. It shouldn’t be the job of external people to find this sort of content sitting on their system,” Lloyd Richardson, technology director at the Canadian Centre for Child Protection, told the Times.

Meanwhile, Twitter laid off a significant number of its employees and contract workers in November 2022, including 15 percent of its trust and safety team — which handles content moderation. At the time, Twitter claimed the changes wouldn’t impact its moderation.

Later that same month, Musk granted a “general amnesty” to banned Twitter accounts, allowing some 62,000 accounts to return to the platform (which included white supremacist accounts). At the same time, reporting revealed that Twitter’s CSAM removal team was decimated in the layoffs, leaving just one member for the entire Asia Pacific region.

In December 2022, Twitter abruptly disbanded its Trust and Safety Council after some members resigned. Musk accused the council of “refusing to take action on child exploitation” even though it was an advisory council that had no decision-making power. Former Twitter CEO Jack Dorsey chimed in to say that Musk’s claim was false, but Musk only doubled down on claims that child safety was a “top priority.”

In February, Twitter said that it was limiting the reach of CSAM content and working to “suspend the bad actor(s) involved.” The company then claimed that it suspended over 400,000 accounts “that created, distributed, or engaged with this content,” which the company says is a 112 percent increase since November.

Despite this, the Times reported that data from the U.S. National Center for Missing and Exploited Children shows Twitter made only about 8,000 reports monthly. Tech companies are legally required to report users even if they only claim to sell or solicit CSAM.

You can read the Times report in full here.

Source: The New York Times Via: The Verge

Twitter un-bans 62,000 accounts as CSAM removal team decimated by layoffs

In today’s Twitter news, the platform has begun restoring some 62,000 accounts following new owner Elon Musk’s “general amnesty” poll. Meanwhile, reports indicate Twitter has stopped enforcing its COVID-19 misinformation policy and has slashed its CSAM removal team to just one person for the Asia Pacific region, despite Musk promising that removing CSAM would be Twitter’s “priority #1.”

Starting with the account restorations, Casey Newton reported via his Platformer newsletter that Twitter began the process of reinstating about 62,000 accounts with over 10,000 followers, including one account with over 5 million followers and 75 accounts with 1 million or more followers. The move, which Twitter employees have started calling “the Big Bang,” comes after Musk polled Twitter users asking if the company should grant a “general amnesty” to suspended accounts.

Platformer notes that the move could add to the instability at Twitter as the company loses valuable engineering talent. Moreover, Twitter’s remaining staff are arguably stretched thin across various Musk projects, such as his goal of bringing back the refreshed Twitter Blue subscription later this week with a manual verification process.

The influx of formerly suspended accounts could have other impacts on Twitter. For example, CNN reported that Twitter added a note to its COVID-19 misinformation page that it no longer enforces the policy. Between January 2020 and September 2022, Twitter suspended over 11,000 accounts for breaking COVID misinformation rules, per statistics published by Twitter. It also removed more than 100,000 pieces of content that violated the rules.

It’s likely that some of the accounts suspended under the misinformation policy will be among those Musk allows back on Twitter.

Twitter CSAM removal team down to one person

Finally, Wired reported that Twitter’s child sex abuse material (CSAM) removal team was decimated by the company’s recent layoffs, leaving just one staff member for the entire Asia Pacific region. Wired notes that it’s not clear how many people were on the CSAM removal team prior to the layoffs, but the Asia Pacific region is home to around 4.3 billion people (roughly 60 percent of the world’s population) and some of Twitter’s busiest markets. For example, there are 59 million Twitter users in Japan, second only to the number of users in the U.S.

Twitter’s CSAM removal teams work with various organizations that collect data about CSAM content. That data feeds into the systems Twitter uses to remove CSAM, and the company’s internal dashboards are considered critical for analyzing the metadata that helps engineers identify CSAM networks and remove content before it spreads.
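For context, systems like those described above typically rely on hash matching: partner organizations reduce known abusive images to digital fingerprints, and uploads are compared against that hash list. The sketch below is a hypothetical, simplified illustration of the general technique, not Twitter’s actual code; the names (known_csam_hashes, check_upload) are invented for this example, and it uses a plain cryptographic hash for brevity where real tools such as PhotoDNA use perceptual hashes.

```python
import hashlib

# Hypothetical sketch of hash-list matching. Production systems use
# perceptual hashing so re-encoded or cropped copies still match; a
# cryptographic hash is used here only to keep the example self-contained.

# Hash values as they might be supplied by a partner organization.
known_csam_hashes = {
    "9f2feb0f1ef425b292f2f94bc8482494df430413324cdd5074524cd2dbe1a49a",
}

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload's fingerprint matches a known hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_csam_hashes

if __name__ == "__main__":
    sample = b"...image bytes..."
    print("flagged" if check_upload(sample) else "no match")
```

An exact-hash check like this one fails the moment a file is re-compressed, which is part of why production systems match perceptual hashes within a distance threshold instead, and why human reviewers remain essential.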

Twitter has a long-running issue with tackling CSAM, with an internal report from April 2022 saying the company “cannot accurately detect child sexual exploitation and nonconsensual nudity at scale.” The problem is further complicated by Twitter allowing consensual pornography on the platform, since tools that scan for CSAM struggle to differentiate between a consenting adult and a non-consenting child.

Additionally, the CSAM problem has helped push brands off Twitter — notably, Dyson and Forbes suspended Twitter advertising campaigns in September after ads appeared next to CSAM.

Needless to say, human staff are integral to finding and removing CSAM. Musk, however, has publicly asked Twitter users to “reply in comments” if they see any CSAM issues on Twitter. Experts told Wired that Musk should be having that conversation with his CSAM removal staff instead of asking users for help in the replies.

You can keep up with the ongoing saga of Twitter under Musk here.

Source: Platformer, CNN, Wired