Content moderation in languages outside of English has been an ongoing challenge for social media companies.

Meta is making changes in response to the findings.

Meta owns the world’s largest social network, Facebook, as well as the photo-and-video service Instagram and the messaging app WhatsApp.

The findings outline several content moderation errors Meta made amid the Israeli-Palestinian conflict last year.

A classifier helps the company’s artificial intelligence systems automatically flag posts that likely violate its rules.

Meta had also lost Hebrew-speaking employees and outsourced content moderation.

Meta also wrongly removed content that didn’t violate its rules.

The report also pointed out other major content moderation mistakes on Meta’s platforms.

Palestinian journalists also reported that their WhatsApp accounts were blocked.