Facebook adds Instagram data to content moderation transparency report

FILE PHOTO: Stickers bearing the Facebook logo are pictured at Facebook Inc’s F8 developers conference in San Jose, California, U.S., April 30, 2019. REUTERS/Stephen Lam/File Photo

(Reuters) – Facebook Inc (FB.O) released its fourth report on enforcement against content that violates its policies on Wednesday, adding data on photo-sharing app Instagram and content depicting suicide or self-harm for the first time.

Proactive detection of violating content was generally lower on Instagram than on Facebook’s flagship app, where the company first implemented many of its detection tools.

For example, the company said it proactively detected content affiliated with terrorist organizations 98.5% of the time on Facebook and 92.2% of the time on Instagram.

Facebook said it had removed about 2.5 million posts in the third quarter that depicted or encouraged suicide or self-injury. The company also removed about 4.4 million pieces of drug sale content during the quarter, it said in a blog post.

Reporting by Akanksha Rana in Bengaluru and Katie Paul in San Francisco; Editing by Maju Samuel and Lisa Shumaker
