Having your website structured so that Google can index it is a crucial factor in SEO. If your site is not indexed, you are invisible in Google's eyes, which in turn means you're invisible to your target consumers. With Google being the reigning champ of the search engines, it is wise to play by its rules.
Below, I break down a few of the reasons why a website may fail to be indexed and, as a result, fail to gain organic traffic.
- The Website Is Being Indexed With a www. or Non-www. Domain
Think of www. as a subdomain. With that in mind, a domain such as https://test.com is not the same as https://www.test.com. You want to make sure that both domains are added to your Google Webmaster Tools account; doing this ensures that both of them are being indexed. Still set your preferred domain, but you will need to verify ownership of both.
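If you prefer one version to receive all of the traffic, a common approach is a 301 redirect from one host to the other. A minimal sketch, assuming an Apache server with mod_rewrite enabled, using the example test.com domain from above:

```apache
# .htaccess — redirect the non-www host to the www host (or vice versa)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^test\.com$ [NC]
RewriteRule ^(.*)$ https://www.test.com/$1 [R=301,L]
```

The [R=301] flag issues a permanent redirect, so search engines consolidate their signals onto the www version rather than treating the two hosts as separate sites.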
- Your website has not been found by Google yet.
Though this is generally a problem with new websites, give it a couple of days at the very least. If Google is still not indexing your site, make sure that the sitemap for the website has been uploaded and is functioning properly. Obviously, if you have not crafted or submitted a sitemap, this may be your issue! You can also request that Google crawl and fetch your website. You can see how to make such a request here.
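A sitemap is just an XML file listing the URLs you want crawled. A minimal sketch following the sitemaps.org protocol, with placeholder URLs on the hypothetical test.com domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.test.com/</loc>
  </url>
  <url>
    <!-- a hypothetical inner page, for illustration -->
    <loc>https://www.test.com/about/</loc>
  </url>
</urlset>
```

Upload it as sitemap.xml in your site root and submit it in Google Webmaster Tools; you can also point crawlers at it with a `Sitemap: https://www.test.com/sitemap.xml` line in robots.txt.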
- Robots.txt Blocking The Site or Page(s)
You could run into the issue of a developer or editor blocking the site with robots.txt. This is a rather easy fix: remove the offending entry from robots.txt, and your website should pop back into action and get indexed. You can learn more about robots.txt here.
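The tell-tale sign is a `Disallow: /` rule. A sketch of what a blocking robots.txt looks like versus one that permits crawling:

```
# Blocks every crawler from the entire site:
User-agent: *
Disallow: /

# Allows full crawling (an empty Disallow blocks nothing):
User-agent: *
Disallow:
```

The two rule sets are shown together here only for comparison; an actual robots.txt should contain just one of them.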
- Crawl Errors
In certain scenarios, Google may not index some of the pages on your website because it cannot crawl them. Google can still see these pages, but it can't crawl them. To find the problem, go to Google Webmaster Tools > choose your website > click "Crawl" > then "Crawl Errors." If Google is having a hard time crawling any pages, it will list them under "Top 1,000 pages with errors."
- Too Much Duplicate Content
If your website has too much duplicate content, you could confuse the search engines, and in return they will give up on trying to index your website. For example, if you have several URLs that return identical content, you now have a duplicate content issue. A solution is to pick one page as the preferred version and 301-redirect the rest of the duplicate pages to it.
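As a sketch, assuming an Apache server and hypothetical paths, the chosen page keeps its URL and each duplicate gets a 301 in .htaccess:

```apache
# .htaccess — permanently redirect duplicate URLs to the chosen page
Redirect 301 /duplicate-page/ https://www.test.com/chosen-page/
Redirect 301 /another-duplicate/ https://www.test.com/chosen-page/
```

Where a redirect is not practical, a `<link rel="canonical" href="...">` tag in the duplicate page's head section tells search engines which URL you prefer instead.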
- Privacy Settings are Turned on
If your site runs on WordPress, there is a chance that you still have the privacy settings turned on. Take a look: Admin > Settings > Privacy.
- Website is Being Blocked by .htaccess
The .htaccess file is a configuration file that lives on your server and controls how Apache serves your website to the world wide web. Even though it can be a useful and handy tool, it can also be used, intentionally or not, to block indexing and crawling.
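If your site has vanished from the index, look for directives like these in .htaccess. This is a sketch assuming Apache; the X-Robots-Tag header requires mod_headers:

```apache
# Sends a noindex header with every response — crawlers will drop the pages:
Header set X-Robots-Tag "noindex, nofollow"

# Blocks all visitors outright, crawlers included
# (Apache 2.2 syntax; 2.4 uses "Require all denied"):
Deny from all
```

Removing any such unintended directives should let Google crawl and index the site again.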
- Website Takes Too Long to Load
Nobody likes a website that takes forever to load, and neither does Google. If the crawlers run into unreasonable load times, it is possible that Google will not index the site at all.
- Google Has Penalized Your Site
This is your worst nightmare: Google has smacked you with a penalty and removed you from its index. If your website has a sketchy history that you're not aware of, a manual penalty could be lurking around and preventing Google from indexing you.
If Google has dropped you from its index, getting back in is a rough road.
All in all, indexing is a key factor in SEO. If you're not being indexed, then finding out why should be your number one priority.
Written by: Zack Rivera – Marketing Coordinator