
B2B Articles - Mar 14, 2012 12:26:34 AM - By Ironpaper

Crawl errors revisited for webmasters

Google just updated the crawl errors feature in Webmaster Tools, which informs designers and webmasters of problems Googlebot encounters when trying to crawl a website. The tool has become a staple for web designers who need insight into problems that could hurt search rankings or site visibility.

With the recent update, crawl errors are now split into two categories:

  1. Site errors
  2. URL errors

Site errors affect the whole site rather than individual pages (a rough sketch of these checks follows the list):

  1. DNS errors
  2. Server connectivity problems
  3. Robots.txt fetch failures
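
These are the checks Googlebot runs before it ever looks at an individual page. The Python sketch below is a rough, illustrative approximation of those three checks, not Google's actual logic; the function name check_site_errors, the port-80 connection, and the timeouts are assumptions made for the example.

```python
import socket
import urllib.error
import urllib.request

def check_site_errors(domain):
    """Rough approximation of the three site-level checks (illustrative only)."""
    # 1. DNS error: the hostname cannot be resolved at all.
    try:
        socket.gethostbyname(domain)
    except socket.gaierror:
        return "DNS error"

    # 2. Server connectivity: the server does not accept a connection.
    #    (Port 80 is an assumption; a real check would also try HTTPS.)
    try:
        with socket.create_connection((domain, 80), timeout=10):
            pass
    except OSError:
        return "Server connectivity problem"

    # 3. Robots.txt fetch: the file cannot be retrieved. A plain 404 is
    #    fine, since a missing robots.txt means "crawl everything"; a
    #    server error or timeout here blocks crawling.
    try:
        urllib.request.urlopen(f"http://{domain}/robots.txt", timeout=10)
    except urllib.error.HTTPError as err:
        if err.code >= 500:
            return "Robots.txt fetch error"
    except urllib.error.URLError:
        return "Robots.txt fetch error"

    return "No site-level errors detected"

print(check_site_errors("example.com"))
```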

URL errors apply to individual pages (a quick soft-404 check follows the list):

  1. Server error
  2. Access denied
  3. Soft 404
  4. Not found
  5. Not followed
  6. Other (as in, zombies got there first)
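
The soft 404 is the least obvious of these: a page that returns HTTP 200 ("OK") but whose content is really an error page, so Googlebot indexes a dead end. The sketch below shows one hypothetical way to flag such pages; the phrase list is an assumption for illustration, not the signal Google actually uses.

```python
import urllib.error
import urllib.request

def looks_like_soft_404(url):
    """Heuristic check: HTTP 200 with a body that reads like an error page."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read(8192).decode("utf-8", errors="replace").lower()
    except urllib.error.HTTPError:
        # A real error status (e.g. a hard 404) is not a soft 404.
        return False
    # Made-up phrase list; tune it to the site you are auditing.
    phrases = ("page not found", "no longer available", "doesn't exist")
    return any(p in body for p in phrases)
```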

Google also now shows a 90-day trend for each type of error within Webmaster Tools, and lists URLs in priority order, ranked by a number of factors.

Unfortunately, the updates to Webmaster Tools have also removed some significant functionality, which may or may not be an accident. The losses include a reduction in the number of URLs you can download, details about soft 404s, the list of URLs blocked by robots.txt, and more.