Facebook CEO, Mark Zuckerberg.


Facebook has taken action on 837 million pieces of spam and shut down 583 million fake accounts on the site in the first three months of 2018.

The company disclosed this in its first quarterly Community Standards Enforcement Report.

Overall, the company said it took moderation action against 1.5 billion accounts and posts for violating its community standards within the period.

Facebook also moderated 2.5 million pieces of hate speech, 1.9 million pieces of terrorist propaganda, 3.4 million pieces of graphic violence and 21 million pieces of content featuring adult nudity and sexual activity.

Several categories of violating content outlined in Facebook’s moderation guidelines – including child sexual exploitation imagery, revenge porn, credible violence, suicidal posts, bullying, harassment, privacy breaches and copyright infringement – are not included in the report.

Facebook also increased the amount of content taken down by using new AI-based tools, which find and moderate content without needing individual users to flag it as suspicious.

The company said the tools worked particularly well on fake accounts and spam, finding 98.5 percent of the fake accounts it shut down and “nearly 100 percent” of the spam.

The company also announced measures that require political advertisers to undergo an authentication process and reveal their affiliation alongside their advertisements.

Last month, YouTube also revealed it had removed 8.3 million videos for breaching its community guidelines between October and December 2017.


