The Basics Of Penguin 2.0:
According to Matt Cutts,
the new version of Penguin is primarily designed to penalize websites
that utilize black hat SEO techniques and reward websites that offer
genuine value. Webmasters who create content that people naturally want
to share and websites that visitors want to explore and return to
shouldn't be affected negatively. He also stated that Penguin 2.0 should
help many small to mid-sized businesses that play by the rules and
avoid black hat SEO. It should also help authority sites gain traction
while decreasing the rankings of sites that merely provide generic
fluff.
The ultimate goal is to cut back on link spamming and hacking, while
providing webmasters with the tools to fix hacked sites. "Cluster
results," where several results from the same website crowd the first
page for a single query, have been somewhat of a problem, so this issue
has also been addressed. Consequently, search results should be more
balanced, with no single site dominating the first page for a search term.
They are also targeting certain search queries that have a reputation
for being affiliated with spam (for example, "payday loans"). Sites
that go overboard with advertisements or try to sneak them in under the
nose of visitors may also be penalized.
In reality, it doesn't appear that Google is reinventing the SEO
wheel with Penguin 2.0, but simply tightening their algorithm to provide
users with the most high-quality and relevant content possible. Of
course, this isn't foolproof, and some legitimate webmasters may get
caught in the crossfire.
Here are some factors to consider when building links in the new Penguin 2.0 environment.
Link Value
Although it's helpful to have a large volume of links, it won't do
much good if they're from "bad neighborhoods." That's why it's so
important to focus on acquiring links from reputable sites. One great
way to do this is via guest blogging. As long as the vast majority of your links are from trusted sites, they should act as a shield that protects you from future updates.
Otherwise, an abundance of links from bad neighborhoods that use
manipulative techniques can have a negative impact. If you're unsure of a
website's credibility, use a tool like PR checker.
This simple tool will quickly display a domain's PageRank. Websites
with a PageRank of 4 or higher are generally suitable for backlinks,
but the higher the better.
However, PageRank isn't always foolproof. It's best to look at the
content on the website and evaluate how useful, relevant, and
interesting it is. Also, check the website's social channels, like
Facebook and Twitter, and see how many followers it has on each. A
higher follower count is generally a good indication of quality and
credibility.
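To make this kind of vetting repeatable, here is a minimal Python sketch of the screening described above. The domains.csv file, its column names, and the cutoffs (a PageRank of 4, a non-zero social following) are purely illustrative assumptions, not the export format of any particular tool.

# Sketch: screen candidate linking domains exported to a CSV file.
# Assumed columns: domain, pagerank, facebook_followers, twitter_followers.
import csv

MIN_PAGERANK = 4  # the "PageRank of 4 or higher" rule of thumb from above

def screen_domains(path="domains.csv"):
    keep, review = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pr = int(row["pagerank"])
            followers = int(row["facebook_followers"]) + int(row["twitter_followers"])
            # Higher PageRank plus a real social following suggests a trusted site.
            if pr >= MIN_PAGERANK and followers > 0:
                keep.append(row["domain"])
            else:
                review.append(row["domain"])
    return keep, review

if __name__ == "__main__":
    keep, review = screen_domains()
    print("Worth pursuing:", keep)
    print("Needs a closer manual look:", review)

Anything that lands in the "closer manual look" bucket is worth reviewing by hand, using the content and social signals above, before you pursue a link.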
Link Velocity
Another issue that Google has addressed is the rate at which a site
acquires links. Except for a few select cases with viral implications,
they know that quality sites usually accumulate links organically and
gradually over time. If a newer website suddenly experiences spikes
where numerous links are acquired overnight, this serves as a red flag
to Google, making it more likely the site will get "sandboxed." Search Engine Watch demonstrated this phenomenon on a line graph in which any more than 75 links a day were ignored by Google.
For this reason, it's best to be somewhat conservative in a link
building campaign and not create huge quantities at one time. Instead,
it's better to space them out over time in a more natural manner.
Basically, the velocity should be consistent or increase slightly over
time.
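As a concrete illustration of monitoring link velocity, the short Python sketch below counts new links acquired per day and flags any day that exceeds a ceiling of 75, echoing the figure Search Engine Watch reported. The example dates and the ceiling are illustrative assumptions, not a rule published by Google.

# Sketch: flag days where link acquisition spikes above a daily ceiling.
from collections import Counter
from datetime import date

DAILY_LIMIT = 75  # illustrative ceiling, echoing the figure cited above

def flag_spikes(acquisition_dates, limit=DAILY_LIMIT):
    per_day = Counter(acquisition_dates)
    return {day: count for day, count in per_day.items() if count > limit}

# Example: 10 links on one day, then 90 the next day.
dates = [date(2013, 6, 1)] * 10 + [date(2013, 6, 2)] * 90
print(flag_spikes(dates))  # {datetime.date(2013, 6, 2): 90}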
Avoid Exact-Match Anchor Text
After analyzing a ton of data, Google and most SEO professionals have
recognized the correlation between exact match anchor text and web
spam. Accordingly, they have taken measures to penalize sites that have
excessive links with exact match keywords in the anchor text. Since this
is likely to keep tightening in the future, it's smart to keep this
practice to a minimum. If your site has an excessive amount of links
with exact match keywords in anchor texts, it's a good idea to edit
those keywords so that they are not exact matches.
Instead, use sentence fragments and branded anchors. I recently wrote
an overview of all the different types of anchor text, along with an
analysis of each one and recommendations for how to properly use anchor text in a Penguin 2.0 environment.
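A quick way to check your own exposure is to measure what share of your anchors are exact-match keywords. The Python sketch below is a minimal example; the target keyword, brand name, and sample anchors are hypothetical.

# Sketch: estimate the share of exact-match anchors in a link profile.
def classify_anchor(anchor, keyword="best running shoes", brand="Acme Shoes"):
    text = anchor.strip().lower()
    if text == keyword.lower():
        return "exact-match"
    if brand.lower() in text:
        return "branded"
    return "other"  # sentence fragments, naked URLs, and so on

anchors = ["best running shoes", "Acme Shoes", "this guide to picking trainers",
           "best running shoes", "www.acmeshoes.example"]
counts = {}
for a in anchors:
    label = classify_anchor(a)
    counts[label] = counts.get(label, 0) + 1

exact_share = counts.get("exact-match", 0) / len(anchors)
print(counts, f"exact-match share: {exact_share:.0%}")

If the exact-match share dominates the profile, that is the cue to start editing those anchors toward branded and fragment variations.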
Link Relevancy
As most webmasters with basic SEO knowledge already know, relevancy
plays a big role in link quality. Links from completely irrelevant sites
can hurt a site's rankings (or, at best, provide negligible value),
while links from highly relevant sites should help as long as they're
reputable. If you've been accumulating links from a hodgepodge of sites
that have nothing to do with your industry, this could account for a
decrease in rankings from Penguin 2.0. Keep this in mind in future SEO
campaigns.
Here are some ways to improve your rankings if you've been adversely affected by Penguin 2.0.
Step 1. Understand Your Link Profile
One of the most effective ways to recover from Penguin 2.0 and
protect yourself from future issues is to have a full understanding of
your link profile. Start with a link profile audit
to identify bad links which could have caused your website to get hit
by Penguin 2.0. If you're the do-it-yourself type, try data tracking
tools like Majestic SEO and Open Site Explorer.
These platforms are designed to create an in-depth picture of your
link profile. Some common features include backlink reports, inbound
link analysis, and daily rank tracking. While this wasn't all that
necessary a short while ago, these tools are becoming more and more
important. After understanding your link profile, you can take the
necessary steps to solve any problem areas.
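If you go the do-it-yourself route, an audit often starts from a CSV export of your backlinks. The Python sketch below flags likely problem links with two crude heuristics: a low trust-style metric and spammy keywords in the source URL. The column names, threshold, and keyword list are assumptions for illustration; adapt them to whatever your tool (such as Majestic SEO) actually exports.

# Sketch: flag potentially bad links in an exported backlink report.
import csv

SPAMMY_HINTS = ("casino", "payday", "viagra")  # crude keyword heuristics

def audit_backlinks(path="backlinks.csv", min_trust=10):
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            url = row["source_url"].lower()
            trust = int(row.get("trust_score", 0))
            # Flag links with a weak trust metric or spam-looking source URLs.
            if trust < min_trust or any(hint in url for hint in SPAMMY_HINTS):
                flagged.append(row["source_url"])
    return flagged

if __name__ == "__main__":
    for url in audit_backlinks():
        print("Review:", url)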
Step 2. Run Backlink Checks on Sites Linking to You
In addition to checking the backlinks of your website, it's a good
idea to investigate the primary sites that link back to you. This can
also be done via Majestic SEO, but there are other tools, like Backlink Watch and Analyze Backlinks,
that are also effective. This is important because if a particular site
is getting links from bad neighborhoods, its own link equity will
suffer. In turn, this can have a negative impact on your SEO. If you find a site with poor link equity, backlinks from that site should be removed.
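One way to act on that is to compute a rough "toxic ratio" for each referring site, assuming you have already exported that site's own backlink domains from one of the tools above. The domains in this Python sketch are hypothetical.

# Sketch: estimate how much of a referring site's own link profile
# comes from known bad neighborhoods.
def toxic_ratio(referrer_backlink_domains, bad_domains):
    if not referrer_backlink_domains:
        return 0.0
    bad = sum(1 for d in referrer_backlink_domains if d in bad_domains)
    return bad / len(referrer_backlink_domains)

bad_neighborhoods = {"spamlinks.example", "cheap-pills.example"}
referrer_links = ["news.example", "spamlinks.example", "blog.example",
                  "cheap-pills.example", "spamlinks.example"]
ratio = toxic_ratio(referrer_links, bad_neighborhoods)
print(f"{ratio:.0%} of this referrer's links look toxic")

A high ratio suggests the referring site's link equity is weak, so a link from it is worth little and is probably better removed.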
Step 3. Remove and Disavow Harmful Links
The consensus is that a website can recover and
improve its ranking over time through the removal and/or disavowal of bad
inbound links. Since Google's algorithm will eventually re-crawl and
re-index content, a website can reclaim its position in the rankings, in
most cases. While this process is usually frustrating and often
time-consuming, it's necessary to get rankings back on track and climb
the SEO ladder once again.
But the question is, how do you know which links to remove or
disavow? You can either get assistance from a professional SEO firm to
analyze your link profile and provide a spreadsheet of which links to
remove, or you can try following this step-by-step guide.
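Once you have the list, links you cannot get removed at the source can be submitted through Google's Disavow Tool as a plain-text file: lines starting with "#" are comments, "domain:" entries disavow a whole domain, and bare URLs disavow individual pages. The Python sketch below writes such a file; the domains and URL in it are hypothetical examples.

# Sketch: write a disavow file in the plain-text format Google's
# Disavow Tool accepts.
bad_domains = ["spamlinks.example", "cheap-pills.example"]
bad_urls = ["http://borderline-site.example/old-widget-page"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Links we asked to have removed but could not get taken down\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")  # disavow every link from this domain
    for url in bad_urls:
        f.write(f"{url}\n")  # disavow a single page

Remember that disavowing is a last resort; request removal from the webmasters first and keep a record of those requests in the file's comments.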
Step 4. Build New, High-Quality Inbound Links
If you've been hit by Penguin 2.0, the best way to prove to Google
that your website belongs in the rankings is by getting other credible,
high-quality websites that Google trusts to vouch for you. You can do
this by getting inbound links from these websites. There are lots of
ways to ethically build high-quality, powerful links, but my favorite is
through guest blogging. If guest blogging isn't an option, then here
are 8 other ways to build links.
Conclusion
Penguin 2.0 is a tightening of the algorithm Google originally
launched back in April of 2012. The principles are the same, as are the
goals Google is trying to achieve with this next iteration. If you've
been hit by Penguin 2.0, follow the steps above to
recover from it, and be sure to tread carefully as you move forward with
your SEO initiative. Don't go for the short-term gain if it sacrifices
your brand in the long-term. Otherwise, Penguin will be making a very
unpleasant visit to your website.