7 potential reasons why Google isn’t indexing your local business website

July 19, 2022

Once search engine crawlers have crawled a website, getting indexed is the second big step towards building an organic traffic stream. If your web pages are not getting indexed by Google, you won’t be receiving organic traffic. It is as simple as that.

Sometimes, you may think that you have done everything right. But Google is still not indexing your web pages.

If you are facing a similar situation, here is a list of potential reasons why Google may not be indexing your website.

Identifying, confirming, and fixing these issues will likely lead Google to start indexing your web pages and showing them on the SERPs.

1. Robots.txt

First, the most common and obvious reason: the robots.txt file on your website may be blocking Google from crawling, and therefore indexing, the pages on your site.

In case you don’t know, robots.txt lets you tell search engine crawlers which pages of your site they may or may not crawl, and therefore which pages can make it into the index. It is not uncommon to make an error and disallow important pages on your website in the robots.txt file.

The robots.txt file looks like this.

User-agent: *
Disallow:

Note that there is nothing after “Disallow:”.

That means that nothing is disallowed, and search engine crawlers are allowed to crawl, and therefore index, every single page on your website.

However, even a single forward slash could change that.

User-agent: *
Disallow: /

The forward slash disallows the root folder of your website, so search engine crawlers are not allowed to crawl (or index) any page on your site.

Make sure that the robots.txt file on your website only disallows the pages you genuinely want to keep out of search.

2. The Noindex, Nofollow meta tags

Other than robots.txt, there is another way to prevent Google from indexing web pages: the noindex and nofollow meta tags.

Unlike the robots.txt file, the noindex and nofollow meta tags work on a page-by-page basis. Unless you deliberately want to keep a certain page out of the index (for example, a thank-you page that appears after a transaction is complete), make sure your pages do not carry the noindex, nofollow meta tags.

Check the HTML source code of the page, search for the keywords “noindex” and “nofollow”, and either remove the robots meta tag or change the values to “index, follow” instead.
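For reference, the tag sits in the page’s head section. The first line below blocks indexing; removing the tag entirely, or switching the values as in the second line, restores the default behaviour:

<meta name="robots" content="noindex, nofollow">  <!-- do not index this page or follow its links -->
<meta name="robots" content="index, follow">      <!-- the default: index the page, follow its links -->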

3. Redirect loops

Pages with redirect loops are not likely to get indexed, because these pages send search engine crawlers around in circles without ever resolving to actual content.

Check the HTML source code of the web page and see if you can find a redirect, such as a meta refresh, that points the URL back to itself.

Sometimes, these issues do not appear in Google Search Console. In that case, you may want to use an external site audit tool, such as Screaming Frog, Semrush, or Ahrefs.
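You can also trace a redirect chain yourself. Here is a minimal sketch in Python using the third-party requests library (the URL is a placeholder for the page you suspect); requests follows redirects by default, and a genuine loop raises a TooManyRedirects error:

import requests

url = "https://www.example.com/some-page/"  # placeholder: the page you suspect

try:
    response = requests.get(url, timeout=10)
    # .history lists every redirect hop that was followed along the way
    for hop in response.history:
        print(hop.status_code, hop.url)
    print(response.status_code, response.url)  # final destination
except requests.exceptions.TooManyRedirects:
    print("Redirect loop detected:", url)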

Remove the redirect, re-submit the page for indexing in Google Search Console, and check again in a few days. Google Search Console should then show a green indexing status.

4. Crawl depth

If a web page is buried within folders, Google may not reach that page at all. If the search engine crawlers cannot crawl a web page, they cannot index it.

Here is an example of such a page navigation structure.

www.domainname.com/shop/shoes/men-shoes/men-nike-shoes/men-nike-running-shoes/men-nike-zoom-x-shoes.html

Google may deprioritise or abandon the crawl of pages that are buried three or four folders deep, and pages that sit many clicks away from the homepage may never be reached at all. Therefore, structure your website so that every page is only a few clicks from the homepage and easy for Google to find.

5. Orphan pages

Sometimes, Google cannot find a page because there is no path leading to it.

Google uses links to find different web pages on the site. Think of it as a spider web. One page may lead to another page, and so on and so forth.

Since orphan pages do not have any other pages linking to them, they are easy to miss.

Ideally, there should not be a single orphan page on your site. Make sure relevant pages link to every page on your site. This web of internal links to and from every web page makes it easier for Google to crawl and index each page, and it allows link equity to flow through the site.
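One way to surface orphan candidates is to compare the URLs listed in your XML sitemap against the internal links found on those same pages. The sketch below does that with the Python standard library; it assumes your sitemap lives at /sitemap.xml, uses a placeholder domain, and ignores subtleties such as trailing-slash variants, so treat its output as a starting list rather than a verdict:

import xml.etree.ElementTree as ET
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

SITE = "https://www.example.com"  # placeholder: your domain

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag, resolved against the page URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(urljoin(self.base_url, value).split("#")[0])

# 1. Read every URL listed in the sitemap.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse(urlopen(SITE + "/sitemap.xml"))
sitemap_urls = {loc.text.strip() for loc in tree.iterfind(".//sm:loc", ns)}

# 2. Collect every internal link found on those pages.
domain = urlparse(SITE).netloc
linked = set()
for url in sitemap_urls:
    collector = LinkCollector(url)
    try:
        collector.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
    except OSError:
        continue  # skip pages that fail to load
    linked |= {link for link in collector.links if urlparse(link).netloc == domain}

# 3. Sitemap URLs that no crawled page links to are orphan candidates.
for orphan in sorted(sitemap_urls - linked):
    print("Orphan candidate:", orphan)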

6. Missing sitemaps

A missing sitemap may not be a huge issue, but it does make it more difficult for Google to find, crawl, and index all the pages on your site.

Without a sitemap, Google is flying blind. Sitemaps help Google navigate your website more efficiently and find, crawl, and index more pages.

You can create a sitemap file easily with a plugin such as the Yoast SEO plugin and submit it to Google via Google Search Console.
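For reference, a minimal sitemap follows the standard sitemap protocol and looks like this (the URLs and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-07-19</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>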

7. A site that takes forever to load

Google does not want to index and rank websites that take forever to load and provide a poor user experience. Depending on how slow your website is, Google may decide to drop a web page from its index altogether rather than simply demoting it on the SERP.

Use a tool like Google PageSpeed Insights or GTmetrix to see how fast your pages load.
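If you want a very rough first number before reaching for those tools, you can time the raw HTML download yourself with a few lines of Python (the URL is a placeholder). This ignores images, scripts, and rendering, so it only gives a lower bound on real page load time:

import time
from urllib.request import urlopen

url = "https://www.example.com/"  # placeholder: your page
start = time.perf_counter()
body = urlopen(url, timeout=30).read()
elapsed = time.perf_counter() - start
print(f"Downloaded {len(body)} bytes of HTML in {elapsed:.2f} seconds")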

Conclusion

There are many things that can go wrong when it comes to indexing, but the reasons mentioned above are the most common ones.

If you’ve been struggling with your local business website because your pages aren’t getting indexed, this checklist is a good place to start.

If you have any questions or need help, give us a call.

 
