Messages reported on WhatsApp are reviewed by more than 1,000 moderators, website reports

WhatsApp employs more than 1,000 outsourced workers to analyze complaints sent by users, according to a report published last Tuesday (7) by the website ProPublica.

The investigation found that Facebook, owner of the messaging application, has teams in Austin (United States), Dublin (Ireland) and Singapore to review text messages, images and videos that may violate its rules.

According to sources interviewed by ProPublica, each employee analyzes approximately 600 complaints a day, which leaves less than a minute, on average, to check each one.

The analyses cover complaints about fraud, spam, child pornography and terrorist conspiracy on WhatsApp.

Due to the application’s end-to-end encryption, moderators can only review messages shared by users via the “report” option available in conversations.


How WhatsApp moderation works

The report pointed out that WhatsApp moderators analyze both “reactive” cases, initiated by users, and “proactive” cases, flagged by the application’s artificial intelligence.

In “reactive” cases, after a user reports a contact, moderators can access the message that allegedly violated the rules along with the four previous messages, including images and videos.

In “proactive” cases, the service looks for suspicious patterns, such as a new account sending an unusually high volume of messages. The tool checks other data that is not encrypted, such as group names and photos, phone numbers, profile photos, and the account’s history of violations, among other signals. In these cases, the service does not access the content of the messages.

When reviewing a report, moderators choose among three options: do nothing, place the user on a watchlist, or ban the account.
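The flow described above can be sketched in code. This is a minimal illustration of the process as ProPublica describes it, not WhatsApp's actual system: all names, types, and the decision criteria (`violates_rules`, `repeat_offender`) are hypothetical assumptions made for the example.

```python
# Hypothetical sketch of the moderation flow described in the report.
# All identifiers and decision criteria are illustrative, not WhatsApp's code.
from dataclasses import dataclass
from typing import List

@dataclass
class Report:
    reported_message: str
    previous_messages: List[str]  # up to the four messages preceding the reported one
    source: str                   # "reactive" (user report) or "proactive" (AI flag)

def build_reactive_report(conversation: List[str], reported_index: int) -> Report:
    """A user report gives moderators the flagged message plus the four before it."""
    start = max(0, reported_index - 4)
    return Report(
        reported_message=conversation[reported_index],
        previous_messages=conversation[start:reported_index],
        source="reactive",
    )

def moderate(report: Report, violates_rules: bool, repeat_offender: bool) -> str:
    """Moderators choose among three outcomes: do nothing, watchlist, or ban."""
    if not violates_rules:
        return "no_action"
    if repeat_offender:
        return "ban_account"
    return "watchlist"
```

Note that in this sketch the encrypted conversation only becomes visible to the moderator because the reporting user chose to forward it; the "proactive" path would operate solely on unencrypted metadata.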

Also according to ProPublica, WhatsApp stated that these employees are not content moderators because they do not interfere with the content of conversations. For the platform, the model differs from the one adopted on Instagram and Facebook, where moderation can remove posts.

Contacted by G1, WhatsApp said it provides a way for users to report spam or abuse on the platform, which includes sharing some messages from a conversation.

“This functionality is important to prevent the worst abuses on the internet. WhatsApp strongly disagrees with the idea that accepting complaints that users choose to send to the app is incompatible with end-to-end encryption,” the service said in a statement.

Learn how to protect yourself from scams on WhatsApp
