One of the biggest complaints about the Penguin update is that it allowed unscrupulous SEOs to point bad links at competitors’ sites and “over-optimize” their link profiles to decrease their rankings.
While the Link Disavow Tool only lets you report links pointing back to your own site through Webmaster Tools, an entire industry will spring up around buying existing domains or building new sites that aim to build links from the same sources as their competitors – and then “report” these bad links to effectively reduce the competitors’ “link juice.”
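For context, the disavow file submitted through Webmaster Tools is just a plain-text list: each line is either a full URL or a `domain:` directive, and lines beginning with `#` are treated as comments. A minimal illustration (the domains here are made up):

```
# Links from this spam network could not be removed by request
domain:spamdomain1.example

# Individual paid link we did not place
http://spamdomain2.example/paid-links/page.html
```

Anything not matching these patterns is ignored, which is why a malformed file can silently fail to disavow anything.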
Aggressive automated link-building tools and techniques will still have their place – in poisoning sites whose link profiles mirror the competition’s. This will most likely be limited to hyper-competitive SEO markets, but depending on how effective the strategy proves, it may spill over into more traditional SEO markets.
So, What Does Google Want from a Link?
That raises the question: what does Google actually want? And how do you differentiate between poor-quality and high-quality links?
Google genuinely wants links to be a vote of confidence for a site. The concept of PageRank stemmed from the idea of citation in academic publishing, where one paper “cites” another in its references. The more citations a paper accumulates, the more important it is considered to be.
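The citation idea above can be made concrete with a toy PageRank calculation. This is an illustrative sketch of the classic power-iteration algorithm, not Google's production system; the damping factor 0.85 is the value from the original PageRank paper, and the three-page link graph is invented for the example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: `links` maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with rank spread evenly

    for _ in range(iterations):
        # Every page gets the "teleport" share regardless of links.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: distribute its rank evenly to all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outbound link passes an equal share of this page's rank.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Three pages: A and C both "cite" B, so B earns the highest rank.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
```

Running this, B ends up with the most rank because two pages cite it, while A, which nobody cites, keeps only the baseline teleport share – exactly the "links as citations" intuition.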
Google wants links that are true measures of citation. Therefore a good guideline is this:
- Good links are those where an actual human being has taken the time to approve the link (or the content containing the link, in the case of genuine, quality content syndication for backlink creation).
- Bad links are those that can be created automatically, with no human intervention or quality review. These kinds of links are ripe for exploitation through automated software tools.
If you haven’t already, I recommend reading through the Google Webmaster Guidelines to get a feel for what Google considers good and bad links. The guidelines were recently updated with a range of examples that make this distinction clearer.