Google Guide: How to Take Down Your Site (SOPA Blackout)

website under maintenance

As webmasters, we take down our sites for several reasons: maintenance, upgrades, and more. For years we have followed the traditional approach of showing an "Under Construction" or "Under Maintenance" image.

But today we will guide you through taking down your site for the SOPA blackout without losing your Google rankings or your indexed pages. Google Webmaster Trends Analyst Pierre Far has put together a guide showing the right way to take down a site temporarily.

Suggestion: SOPA Blackout: How To Back Up Your Social Media Account Data

The most common scenario Google sees is webmasters replacing the content of all or some of their pages with an error message (“site offline”) or a protest message. The following points apply to this scenario (replacing the content of your pages).

#1. 503 HTTP header for all URLs

503 Error

Webmasters should return a 503 HTTP header for all the URLs participating in the blackout (parts of a site or the whole site). This helps in two ways:

  1. It tells Google it’s not the “real” content on the site and won’t be indexed.
  2. Even if Google sees the same content (e.g. the “site offline” message) on all the URLs, it won’t cause duplicate content issues.

Googlebot’s crawl rate will drop when it sees a spike in 503 headers. This is unavoidable, but as long as the blackout is only a transient event, it shouldn’t cause any long-term problems. The crawl rate will recover fairly quickly to the pre-blackout rate.

Suggestion: You can use this simple WordPress plugin, which puts your blog into maintenance mode by sending a 503 error to all unauthenticated users.

To handle your blog's downtime manually, have a read here.
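If you prefer to script the blackout yourself, here is a minimal sketch of the idea using only Python's standard library: every request receives a 503 status plus a Retry-After header. The port, retry interval, and protest message below are illustrative assumptions for the example, not values from Google's guide.

```python
# Minimal sketch (not the official plugin): answer every URL with a 503 status
# and a Retry-After header for the duration of the blackout.
# The port, message, and retry interval are assumptions for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer

BLACKOUT_MESSAGE = b"<h1>Site offline in protest of SOPA</h1>"

class BlackoutHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 tells Googlebot this is temporary, not the site's real content.
        self.send_response(503)
        # Retry-After hints (in seconds) when crawlers should come back.
        self.send_header("Retry-After", "86400")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(BLACKOUT_MESSAGE)

if __name__ == "__main__":
    HTTPServer(("", 8080), BlackoutHandler).serve_forever()
```

On a real WordPress host you would normally do this at the web server or plugin level rather than running a separate process; the sketch only shows which status code and header matter.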

#2. Two important notes about robots.txt

Importance of robots.txt

  1. As Googlebot is currently configured, it will halt all crawling of the site if the site’s robots.txt file returns a 503 status code for robots.txt. This crawling block will continue until Googlebot sees an acceptable status code for robots.txt fetches (currently 200 or 404). This is a built-in safety mechanism so that Googlebot doesn’t end up crawling content it’s usually blocked from reaching. So if you’re blacking out only a portion of the site, be sure the robots.txt file’s status code is not changed to a 503.
  2. Some webmasters may be tempted to change the robots.txt file to “Disallow: /” in an attempt to block crawling during the blackout. But this has a high chance of causing crawling issues for much longer than the few days expected for the crawl-rate recovery, so we would suggest not using “Disallow: /” for the SOPA blackout. A quick way to check both of these points is sketched below.
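As a quick sanity check (a sketch of our own, not tooling from Google's guide), you can verify that robots.txt still answers 200 or 404 while blacked-out pages answer 503. The example below uses Python's standard library and a placeholder domain; substitute your own site.

```python
# Sanity check sketch: robots.txt must NOT return 503 during the blackout,
# while the blacked-out pages SHOULD return 503.
# "https://example.com" is a placeholder domain.
from urllib.request import urlopen
from urllib.error import HTTPError

def status_of(url):
    try:
        return urlopen(url).getcode()
    except HTTPError as err:
        # urlopen raises for 4xx/5xx; the status code is on the exception.
        return err.code

site = "https://example.com"
robots_status = status_of(site + "/robots.txt")
page_status = status_of(site + "/")

assert robots_status in (200, 404), "robots.txt must not return 503 during the blackout"
assert page_status == 503, "blacked-out pages should return 503"
print("robots.txt:", robots_status, "| page:", page_status)
```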

#3. Webmaster Tools

Google Webmaster Tools

Webmasters will notice these 503 errors in Webmaster Tools: it will report that Google saw the blackout. Be sure to monitor the Crawl Errors section particularly closely for a couple of weeks after the blackout to ensure there aren’t any unexpected lingering issues.

#4. General advice

General Advice

Keep it simple and don’t change too many things, especially changes that take different amounts of time to take effect. Don’t change the DNS settings. As mentioned above, don’t change the robots.txt file content. Also, don’t alter the crawl rate setting in Webmaster Tools. Keep as many settings constant as possible before, during, and after the blackout. This will minimize the chances of something odd happening.

These are the wise steps to follow before you take down your site for any reason. Keep it simple, and keep a healthy relationship between your site and Googlebot for better rankings.

Source

=====================================================

Share this article with your friends and make them aware of these points. Hit the comment section if you have any questions. Thank you.
