
Crawl Errors that are Hurting Your Search Engine Rankings

As a site owner, you may have come across server and DNS errors. These errors may seem harmless, but they're actually doing terrible things to your search engine rankings. Ignoring them will only make the situation worse, and you may wake up one day to find your site virtually nonexistent to search engines. Today we're briefly going over the various crawl errors and what can cause them. Crawl errors come in two basic types: Site Errors and URL Errors.

Site Errors

If Google crawls your site and discovers more than a healthy amount of errors, you will be notified via a message in your account. You should then look at the Site Errors section on the Crawl Errors page. Here you'll find error codes for the three basic error types: Server Connectivity, robots.txt Fetch, and DNS. A green check beside each error type indicates it's in a normal state. Otherwise, you'll see a box; click on it to get a detailed 90-day crawl report.


DNS Errors

Although, according to Google, most DNS errors will not affect how Googlebot connects to your site, you should still deal with them swiftly and decisively. Treat this type of error as important, because DNS holds the key to accessing your website.

Tools like Google's Fetch as Google can assist you in tackling and solving DNS errors on your site.
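Before reaching for those tools, you can confirm whether a hostname resolves at all with a few lines of Python. This is a minimal sketch: the `dns_ok` helper and the hostnames are hypothetical, and a production check would also query your authoritative name servers.

```python
import socket

def dns_ok(hostname):
    """Return True if the hostname resolves, False on a DNS failure."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

print(dns_ok("localhost"))      # resolves on a normal system
print(dns_ok("name.invalid"))   # .invalid is reserved and never resolves
```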

Server Errors

If you are experiencing this type of error, it's most likely because your site is busy, the search engine can't access your site, or the request for your site timed out. This error is oftentimes due to technical issues on the end of the server on which your site is hosted. Other times it's partly due to connectivity issues that you can easily remedy. To reduce server errors:

  • Keep the loading time for dynamic page requests short
  • Confirm you're not unknowingly blocking Google or other search engines from accessing your site
  • Ensure your server is correctly configured and connected to the internet
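The first point is easy to monitor. Here is a minimal sketch; the handler, helper name, and 2-second budget are all hypothetical assumptions, not an official Google threshold:

```python
import time

BUDGET_SECONDS = 2.0  # hypothetical budget; slow responses risk crawler timeouts

def render_page():
    """Stand-in for a dynamic page handler."""
    return "<html>...</html>"

def within_budget(handler):
    """Time one call to the handler and report whether it stays in budget."""
    start = time.monotonic()
    handler()
    elapsed = time.monotonic() - start
    return elapsed <= BUDGET_SECONDS

print(within_budget(render_page))  # True for this trivial handler
```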

Robots.txt Fetch Error

First, we have to look at the robots.txt file. What does it do? A robots.txt file is created by the site owner when he doesn't want certain pages on his site to be crawled and indexed by Google. Links to the restricted pages are listed in the robots.txt file. Before Google crawls a website, it checks the robots.txt file for the pages that should not be crawled. If this file exists but cannot be accessed, no crawling will be executed on the whole website; crawling will be postponed until access to the file is possible. If the file doesn't exist, Google will crawl the entire site.
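You can check how a crawler will interpret your robots.txt with Python's standard `urllib.robotparser`. The file contents and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a /private/ section for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # True
```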


URL Errors

URL Errors are very different from Site Errors. How? They only affect specific pages on your site, not your entire website as Site Errors do. These errors include:

404 Errors

This occurs when search engines crawl a page on your site only to discover the page doesn't exist. Google says this about 404s: "Generally, 404 errors don't affect your site's ranking in Google, so you can safely ignore them."

You can't ignore them, though. They may not be a major SEO problem, but they are still issues you need to attend to, especially if the pages returning 404 errors carry important information your customers need to know. If that is the case, do any of the following to fix the problem:

  • Confirm the page is published and not saved as a draft or deleted
  • Confirm the link to the page returning the error is correct
  • 301 redirect the page to another page if reviving it is not your intention
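How you set up a 301 depends on your server, but the logic behind the third fix is just a lookup from retired URLs to their replacements. A minimal sketch with made-up paths:

```python
# Hypothetical redirect map for retired pages: old path -> new destination.
REDIRECTS = {
    "/old-services.html": "/services/",
    "/2019-promo.html": "/promotions/",
}

def resolve(path):
    """Return a (status, location) pair: 301 if the path was retired,
    otherwise 200 with the path itself."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/old-services.html"))  # (301, '/services/')
print(resolve("/about.html"))         # (200, '/about.html')
```

In practice you would configure this in your web server (for example, with Apache's `Redirect 301` directive) rather than in application code.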

Access Denied

Google's crawlers will be denied access to your site when crawling if:

  • The robots.txt file has blocked access to that page or your entire site
  • Users are required to log in before they are granted access to your site

To fix this error:

  • Confirm the pages blocked in your robots.txt file aren't meant to be crawled
  • Stop the login page from coming up on pages you want Google to crawl
  • Use a tool like Screaming Frog to scan your site for pages that require a login before granting access
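A crawl report like Screaming Frog's boils down to a list of URLs and the status codes a logged-out client receives; 401 and 403 responses are the "Access Denied" candidates. A toy sketch with invented paths and statuses:

```python
# Hypothetical crawl results: path -> HTTP status returned to a logged-out client.
crawl = {
    "/": 200,
    "/pricing/": 200,
    "/members/": 401,
    "/admin/": 403,
}

# 401 (login required) and 403 (forbidden) are the "Access Denied" cases.
blocked = [path for path, status in crawl.items() if status in (401, 403)]
print(blocked)  # ['/members/', '/admin/']
```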

These errors, while uncommon, will cause your search rankings to fall. Keep them in your sights.

Not Followed

This error is sometimes confused with the "nofollow" directive, so take special care to distinguish between the two. The "not followed" error occurs when Google can't follow the link in question. Oftentimes it's the result of Flash or JavaScript content that doesn't respond to Google's crawler.

True, these errors are very technical and don't sound like typical SEO material. But don't let that deceive you: all these "technical" errors and issues play vital roles in your search engine rankings. Knowing these errors and how to solve them will take you to heights far above your competition.

