Google has published an article explaining how Maps reviews work: in particular, how the company develops and enforces its rules, and what roles people and algorithms play in content moderation.
Creating and enforcing rules
Google noted that the rules and protections in Maps do not stand still: they evolve over time. Regular policy reviews help the company adapt its rules to better protect local businesses from baseless accusations.
For example, when governments and businesses began requiring proof of COVID-19 vaccination, Google implemented additional safeguards to remove reviews that criticized businesses for their safety policies or for complying with government vaccination mandates.
Google also said that once a rule is created, it becomes training material for both its employees and its machine learning algorithms.
Review moderation
This process is based on machine learning. Reviews left by users are sent to Google's moderation system immediately after submission, says SearchEngines. If no violations are found, the review is published, usually within seconds.
To determine whether a review may violate the rules, Google's moderation systems evaluate it from several angles (a simplified sketch follows the list):
- Whether the review contains offensive or irrelevant content.
- Whether the account that left the review has shown suspicious behavior.
- Whether there has been unusual activity around the place or business in question; for example, a surge of reviews in a short period of time, or recent media coverage of the company or place that may encourage people to leave false reviews.
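The article does not describe how these signals are actually combined, so purely as an illustration, here is a minimal Python sketch of a multi-signal check along the lines described above. Every name, word list, and threshold below is an invented assumption, not part of Google's system.

```python
# Hypothetical sketch of a multi-signal review check. All names,
# thresholds, and data shapes are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Review:
    account_id: str
    place_id: str
    text: str


OFFENSIVE_TERMS = {"scam", "fraud"}  # placeholder word list, not Google's


def looks_offensive_or_irrelevant(review: Review) -> bool:
    """Signal 1: naive content check (real systems use trained classifiers)."""
    words = set(review.text.lower().split())
    return bool(words & OFFENSIVE_TERMS)


def account_is_suspicious(review: Review, flagged_accounts: set[str]) -> bool:
    """Signal 2: has this account shown suspicious behavior before?"""
    return review.account_id in flagged_accounts


def place_has_unusual_activity(review: Review, recent_counts: dict[str, int],
                               spike_threshold: int = 50) -> bool:
    """Signal 3: a burst of reviews for one place in a short window."""
    return recent_counts.get(review.place_id, 0) > spike_threshold


def moderate(review: Review, flagged_accounts: set[str],
             recent_counts: dict[str, int]) -> str:
    """Publish unless a signal fires; otherwise hold or reject."""
    if looks_offensive_or_irrelevant(review):
        return "rejected: content"
    if account_is_suspicious(review, flagged_accounts):
        return "held: suspicious account"
    if place_has_unusual_activity(review, recent_counts):
        return "held: unusual activity at place"
    return "published"


if __name__ == "__main__":
    review = Review("acct-1", "place-9", "Great coffee and friendly staff")
    print(moderate(review, flagged_accounts=set(),
                   recent_counts={"place-9": 3}))  # -> published
```

In this toy version a clean review passes all three checks and is published immediately, which matches the "within seconds" flow described above; anything that trips a signal is held instead of silently rejected.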
Involvement of Google employees in review moderation
Moderators conduct quality checks and provide additional training to reduce bias in the machine learning models. They also review content flagged by users.
When Google finds fake reviews, it removes them and, in some cases, suspends the accounts that left them.
Proactive protection measures
Google's systems continue to analyze reviews and monitor for abnormal patterns even after they have been posted, notes NIX Solutions.
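One simple way to picture post-publication monitoring is spike detection on a place's review volume. The sketch below flags a day whose review count sits far above the recent baseline; the window and z-score cutoff are assumptions for illustration, not values from the article.

```python
# Hypothetical sketch: flag a place when today's review count spikes
# far above its trailing baseline. Cutoffs are illustrative only.
from statistics import mean, stdev


def is_abnormal_spike(daily_counts: list[int], today: int,
                      z_cutoff: float = 3.0) -> bool:
    """Compare today's count to the mean/stdev of the trailing window."""
    if len(daily_counts) < 2:
        return False  # not enough history to establish a baseline
    mu, sigma = mean(daily_counts), stdev(daily_counts)
    if sigma == 0:
        return today > mu  # flat history: any increase stands out
    return (today - mu) / sigma > z_cutoff


# Example: a place that normally gets 2-4 reviews a day suddenly gets 40.
history = [3, 2, 4, 3, 2, 3, 4]
print(is_abnormal_spike(history, today=40))  # -> True
```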
In addition, the moderation team works to identify potential abuse risks in advance.
“For example, when an event with a large audience, such as an election, approaches, we implement increased security for places associated with this event and other nearby businesses that people can search for on Maps,” the company explained.