
Google Penalties – Are They Real?


Every SEO and website forum on the web carries posts about the seemingly endless range of Google penalties. A quick search brings up the -3, -6, -30 and -500, although there are many more. Penalty phrases have been coined to suit just about every change in search engine positioning imaginable; they cannot all be true.

Google do impose penalties, or deindexing, on websites that break their guidelines, but they also change their algorithms. Equally, other sites around the one in question may improve their optimisation techniques, or see an influx of worthwhile inbound links. Small changes website owners make can have a disproportionate effect on search engine indexing, even if this doesn’t seem likely. Many readers of this article will know of a large site that dropped badly and spent months trying to work out why Google had penalised it. The eventual answer was not a penalty at all, simply a few changes in internal linking that nobody imagined would have such an effect.

If your website suddenly slides down the rankings, assuming a penalty is the wrong move. It is easy to focus on any fractional way you might have offended Google, spend days emailing to ask for links to be removed, strip out sections of a website that help earn revenue, or tear apart valuable pages, when none of this was doing any harm. Google’s guidelines do of course need to be considered, and if you are knowingly breaking them, that needs attention, but too many website owners assume the worst. There are other common areas to consider:

1. Consider any recent changes to the site. If they could possibly have had an effect, try reversing them. Give every encouragement to robots, such as a fresh XML sitemap (a minimal example is sketched after this list) or a few new links, then see whether there is a positive effect.

2. Analyse robots.txt files carefully. Even if they have stayed the same, the way errors are allowed for could have been reinterpreted by search engines. The aim of a good robot is to comply with requests not to index, and there is clear evidence that when confronted with a doubtful rule, crawlers may now choose not to index rather than risk a mistake (see the robots.txt example after this list).

3. Apply the same scrutiny to .htaccess files. Errors in these can send a robot on a wild goose chase and give it less reason to return often. An incorrect choice of redirect, such as a temporary 302 where a permanent 301 is needed, can make a major difference to indexing (see the redirect example after this list).

4. Run the pages through a validator. Perfect code is not essential for good indexing, and errors severe enough to prevent it are rare, but they do happen. A single quotation mark in the wrong place can break a page for a robot (an example is shown after this list).

5. Check your site for broken links; a tool such as Xenu will save time (a simple script is also sketched after this list). Also consider whether your internal linking and your choice of pages to be indexed best serve your visitors, human or virtual. If a few of your pages add nothing to the experience of either, they could be diluting success.
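
For point 1, a fresh sitemap needs nothing elaborate. A minimal sketch, with placeholder URLs and dates, looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services.html</loc>
    <lastmod>2014-01-01</lastmod>
  </url>
</urlset>

Point robots at it with a Sitemap: line in robots.txt, or submit it through Google Webmaster Tools, so a revisit happens sooner rather than later.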
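
For point 2, one careless rule in robots.txt can keep a cautious robot away from pages you want indexed. A sketch of the sort of thing to check for, with hypothetical paths:

User-agent: *
# Intended: keep robots out of a working directory
Disallow: /temp/
# A stray catch-all like the line below blocks the entire site and is easy to miss
Disallow: /

Remember too that Disallow rules match by prefix, so Disallow: /page keeps robots away from every URL beginning with /page, not just one file. When a rule is ambiguous, assume the robot will err on the side of not indexing.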
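
For point 3, the difference between a temporary and a permanent redirect is a single status code in .htaccess. A sketch, assuming an Apache server where .htaccess overrides are allowed, with placeholder paths and domain:

# Permanent move: tells robots to index the new URL and pass on the old page's standing
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Temporary move: robots keep treating the old URL as the one that matters
# Redirect 302 /old-page.html http://www.example.com/holding-page.html

If a page has moved for good, make sure it is the 301 form in use.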
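
For point 4, this is the kind of slip a validator catches in seconds. In the hypothetical snippet below, the missing closing quote on the first link swallows the markup that follows it, so a robot may never see the second link at all:

<a href="/products.html>Our products</a>
<a href="/contact.html">Contact us</a>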
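
For point 5, Xenu or a similar crawler is the quick route, but even a short script can flag dead links on a single page. A minimal sketch in Python, using only the standard library and a hypothetical START_URL:

import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

START_URL = "http://www.example.com/"  # placeholder: your own page to check

class LinkParser(HTMLParser):
    # Collect the href value of every anchor tag on the page
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(start_url):
    # Fetch the page, gather its links and report any that do not return 200
    html = urllib.request.urlopen(start_url, timeout=10).read().decode("utf-8", "ignore")
    parser = LinkParser()
    parser.feed(html)
    for link in parser.links:
        url = urljoin(start_url, link)
        if urlparse(url).scheme not in ("http", "https"):
            continue
        try:
            status = urllib.request.urlopen(url, timeout=10).status
        except urllib.error.URLError as err:
            status = getattr(err, "code", None) or err.reason
        if status != 200:
            print(url, "->", status)

if __name__ == "__main__":
    check_links(START_URL)

A proper crawler follows internal links across the whole site and respects robots.txt; this only checks one page, which is why a dedicated tool saves time.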

The last of these points depends on how search engines see a website, and that changes as a natural part of search evolution. Algorithm updates are frequent, and the way they shift SERP positions is a more likely cause of a drop than a direct penalty.

Unless you are aware of clear reasons why a search engine would choose to penalise your website, do not waste days agonising over every possible cause of something that may not be real. Ensure your website is technically up to scratch and go back to the basics of SEO: varied, relevant titles, good headers and alt tags, fresh, meaningful content, and reasons for other good sites to offer freely given inbound links. If you drop, focus on going forward; that is far more productive than chasing ghosts.
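
Those basics translate into very ordinary markup. A hypothetical illustration of the titles, headers and alt text mentioned above, with made-up page content:

<head>
  <title>Handmade Oak Dining Tables | Smith Furniture</title>
  <meta name="description" content="Solid oak dining tables made to order and delivered nationwide.">
</head>
<body>
  <h1>Handmade Oak Dining Tables</h1>
  <h2>Made to order in our own workshop</h2>
  <img src="/images/oak-table.jpg" alt="Six-seat solid oak dining table">
</body>

Every page should carry its own title and description, written for the visitor first; duplicating them across a site wastes one of the simplest signals available.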

