Mobile Syrup

YouTube introducing new policies related to eating disorder content

Popular video-sharing platform YouTube has announced it’s cracking down on content that relates to eating disorders. As part of this, the site will prohibit content that features behaviours such as purging after eating or extreme calorie counting.

Although YouTube has always removed content that encourages or glorifies eating disorders, this new addition to the Community Guidelines is an effort to prevent at-risk viewers from imitating harmful content uploaded to the site.

Videos featuring harmful content related to eating disorders will not be removed from the platform outright; instead, they will be age-restricted, meaning users must log into YouTube and be 18 years of age or older to view them. The policy changes were created in consultation with the National Eating Disorder Association as well as other non-profit organizations.

The new policy changes come in the wake of backlash towards platforms like YouTube and Instagram from certain lawmakers. In 2021, the site was called out for promoting accounts featuring content depicting extreme weight loss and dieting to young users.

This isn’t the first time YouTube has updated its policies to address specific issues, with the platform recently rolling out several updates for how it handles medical topics such as vaccines and abortions.

In addition to flagging videos, YouTube will add panels with information on crisis resources under eating disorder-related content in nine countries with more to come.

To better educate creators, YouTube will provide resources on how to create less harmful content whenever an account's video goes against the new policy.

The video-sharing company plans to roll out the new changes globally in the coming weeks and will use both automated and human moderators to analyze videos on the platform.

Source: YouTube Via: CTV News


Twitter tests a way to let users add content warnings to specific posts

Twitter is working on a new feature that will let users add content warnings to individual photos and videos sent out in tweets. According to the social network, the feature is currently in testing and only available to select users.

So far, the only way to include content warnings in your tweets has been through an account-wide setting, which means every picture and video you post carries a content warning. The new feature will instead allow users to add warnings to a single tweet and sort them into specific categories.

The video above shows that when you’re editing a picture or video, you’ll be able to tap the flag icon in the bottom right corner of the toolbar to add a content warning. Following that, users will be able to categorize the warning as “nudity,” “violence,” or “sensitive.” Once you post the tweet, the image will be blurred out with an overlay explaining why the content is flagged.

Twitter says it will also continue to rely on user reports to flag sensitive media.

Source: Twitter