Google Scraper Report Form: Algo or Manual, We Wonder?
So the latest news from Google – well, from Matt Cutts, head of the Google Spam Team – is that there's a brand-new Google form anyone can use to report a scraper site ranking higher than the original author's site.
Huh? Well, here’s what he meant – at least as I see it.
If you write your own content – say, a great blog piece on how to attract brand-new truck driver candidates for your trucking firm – and then post it on your blog, that's a good thing. You're sharing your expertise with the folks who might find that kind of article of interest, and yes, you use Social Media via your own company pages to publicize that fresh content. Let a day or two go by and you then check that post and its position in the Google rankings… and hey, great, you're at #7 on the page….
But wait, you think… as you read the first six on that page – one of them "scraped" your site, stole your content, and posted it on their own pages, claiming that THEY are the authors. And of course, the Google bot found it and, for whatever reason, ranked this scraper site higher than yours!
Unfair? Absolutely… and until now it's been almost impossible to get any kind of justice outta Google – or, for that matter, from the scraper site itself.
Been there. Done that. And yes, I'm always so, so upset when this kinda thing happens… and yes, you really need to police your site ALL THE TIME.
But back to the new Scraper Report – Matt Cutts announced it via a tweet, seen here –
And as you can see from the URL here, it's a pretty simple form to fill out that allows anyone who finds a scraper site outranking the author's site to report it – i.e., it doesn't have to be your own site that got scraped… you report 'em as you find 'em… but I'd bet that for the most part, we'll be looking at our own sites only!
Good. And about time – Google did something slightly similar a couple of years back. But here's what I'm thinking…
This form sends Google info on the scraper site – i.e., Google gets the URL. And as I see it, Google "can" use the reported scraper infringements to test their algo for changes that will result in "de-ranking" the scraper site below the author's site, at the least. It may also remove the scraper site from the rankings for that keyword/phrase entirely (tho I doubt it), and lastly, it will help Google re-jig their ranking index.
That, tho, as I see it, is the "end-case scenario" – it will need to be a manual process at first, with Googlers looking at the reports and trying the links… then somehow passing their findings on to the spam team.
That's a good thing, right?