What are the Google Quality Search Rater Guidelines?
They are a checklist given to independent contractors hired by Google, who are tasked with scoring the search results Google returns for certain search phrases.
If you want to see the Search Quality Rater Guidelines, there’s a link here. They were last updated publicly on July 20, 2018.
These human evaluators look at search results and grade the pages that appear, using detailed criteria. Google uses about 10,000 contractors worldwide to do this work.
What does Google do with this information?
The Search Quality Raters do not directly affect search rankings, but they give feedback to the Google engineers, who then adjust the ranking algorithm. The idea is to have the artificial intelligence choose the same results that a human being would.
Some articles have been circulating around the SEO world in the last few days. Apparently, you can detect if and when your site was visited by the Search Quality Raters by looking for specific URLs in Google Analytics.
Internet marketer Matthew Woodward put together a custom Google Analytics segment that you can import to look for these URLs.
If the Search Quality Raters haven’t worked behind a proxy (which would strip out the referring URL), these are the referrers to look for.
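If you prefer to check for these visits outside of Google Analytics, the same idea can be applied to any traffic export. Here is a minimal sketch in Python, assuming you have your referral traffic as rows of dicts (for example, parsed from a CSV export) and assuming a hypothetical domain list — raterhub.com is the domain most often cited in the SEO articles, and the list is easy to extend:

```python
# Sketch: scan referral-traffic rows for rater-related referrer domains.
# The domain list and the row format are assumptions, not a definitive spec.
RATER_DOMAINS = ["raterhub.com"]  # hypothetical/partial list

def find_rater_visits(rows):
    """Return the rows whose referrer contains one of the rater domains.

    `rows` is assumed to be an iterable of dicts with a 'referrer' key,
    e.g. parsed from a Google Analytics referral-traffic CSV export.
    """
    hits = []
    for row in rows:
        referrer = row.get("referrer", "").lower()
        if any(domain in referrer for domain in RATER_DOMAINS):
            hits.append(row)
    return hits

# Tiny in-memory example standing in for a real export:
sample = [
    {"date": "2018-07-01", "referrer": "https://raterhub.com/evaluation/rater", "page": "/pricing"},
    {"date": "2018-07-02", "referrer": "https://www.google.com/", "page": "/blog"},
]
rater_visits = find_rater_visits(sample)
```

The substring match is deliberately loose so it catches subdomains and path variations; if you want stricter matching, parse the hostname out of the referrer URL first.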
Is There Something To This?
Matthew makes a compelling case based on 30 sites where he has seen one or more of these referrers visit. In each case, he says, traffic went up or down shortly thereafter.
The theory is, if the Search Quality Raters visit your site, the Google engineers adjust the algorithm accordingly.
Again, it’s super important to remember: the human Search Quality Raters do not directly affect or “tweak” your rankings. There’s simply no way to do that at scale for every site on the internet. Also, this works on a page-by-page basis. The Search Quality Rater team isn’t judging your entire site when they visit. They are looking at specific search results pages and evaluating what shows up there.
Here’s what I think is happening.
The Google engineers take feedback they get from the Search Quality Raters, about a multitude of search results, and “close holes” in the algorithm by making adjustments to how certain factors may be weighted.
Let’s remember, too, that Google reps have stated the engineers make changes to the algorithm about five or six times a day. Most professional SEOs believe Google wants RankBrain (the algorithm’s AI) to make choices like a human, but it is still early in the process. There’s a lot of nuance to how humans think.
I looked up a client I did an SEO audit for last year, whose traffic had dropped off. Sure enough, there was a visit from Raterhub a few weeks before the traffic started dipping.
Now, there were other factors that were in flux, and it looks like the Search Quality Raters only visited one page. If there were more visits, they may have been proxied.
One case is not enough to draw any conclusions.
My feeling is you shouldn’t wait for the Search Quality Raters to come to your site to spring into action. You should be proactive about improving your traffic, even when you’re on top.
Here’s My Two Cents
Continually analyze what Google is already ranking. How can you make adjustments to fit those patterns? How can you make Google see your pages as the best fit for a search query?
SEO is about continual, incremental improvement. It’s never a “one and done” event.
Work every week at improving your best performing content, your overall site, your backlink profile, and your brand, and you’ll grow your traffic.