TL;DR: Crawling, rendering, and indexing are vital for SEO success: search engines must be able to crawl your pages, render them, and index them before you can rank. Below, we explain each step and how to improve it.

To increase your knowledge and become your own technical SEO consultant, you will soon need to think about crawling, rendering, and indexing. 

We outline these terms in our terminology explainer post, but it’s worth taking the time to talk about each here. They are the cornerstones of technical SEO, and if they are not properly implemented, it will be impossible for you to rank.


What’s the difference between crawling, rendering, and indexing?

Let’s begin in order. 


Crawling

Crawling is how search engines, like Google, read your website using bots called spiders or crawlers. There are several reasons why a page may be crawled:

  • You have submitted an XML sitemap that includes the page.
  • There are internal links pointing to the page.
  • There are external links pointing to the page.
  • The page is experiencing a spike in traffic.

Your current technical SEO consultant may have told you that to appear in search results, Google must crawl your website. While this is true, crawling on its own doesn’t mean you will show up in results. It simply means that Google is looking at your page to assess its value. If the bot determines that the page it is crawling has something valuable to add, it may schedule the page to be indexed.


Rendering

If your website has been crawled, the search engine bot has found you, but it still needs more information to understand the quality and value of your pages. Every page can exist in two states, the initial HTML and the rendered HTML, and the two states can be very different from each other.
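To picture the difference, here’s a contrived page (the content is hypothetical) whose initial HTML is almost empty; the heading only exists in the rendered HTML, after the script has run:

```html
<!-- Initial HTML: all a non-rendering crawler would see is an empty div -->
<div id="app"></div>
<script>
  // Rendered HTML: a browser (or Google's renderer) executes this script,
  // so the heading exists only after JavaScript runs.
  document.getElementById('app').innerHTML = '<h1>Our Services</h1>';
</script>
```

Pages built this way depend entirely on rendering to be understood, which is why the two states matter.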

When rendering, Google runs your page’s code and assesses the content to understand how your site is laid out. What’s more, the process helps the search engine determine how long the page takes to load, what type of content exists, what links are present, and similar factors that establish the site’s value.

Once the quality of your website has been determined through rendering, Google or the other search engines can then index it, which is essential to begin ranking.


Indexing

Finally, we get to indexing. A search engine’s index is its database of all the pages it can show, depending on the keyword someone searches for. To put it simply, if your website isn’t indexed, people will not be able to find your business.


Improving your crawling, rendering, and indexing

So far, we’ve talked about crawling, rendering, and indexing as three separate concepts, which they indeed are. That said, they are each closely related and any steps you take to improve one will have an impact on the others. 

In a previous post, we spoke about site structure and navigation. Optimizing your site architecture is an essential step in achieving technical SEO results. Review the post above and ensure you are following best practices including a flat structure, good internal linking, breadcrumbs navigation, etc.

Submit an XML Sitemap

Sitemaps are blueprints of your website and are essential for search engines to find, crawl, and index your site. They also let search engines know which pages are more important than others, which helps you avoid having your own pages compete against each other.

There are different types of sitemaps:

  • XML Sitemap: The most common, which links to the pages on your website.
  • Video Sitemap: Specifically for video content.
  • News Sitemap: If you want to be approved for Google News.
  • Image Sitemap: Helps Google find any images hosted on your website.

For this post, we’ll focus on a normal XML sitemap. 
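To show what’s inside one, here is a bare-bones XML sitemap (the domain and dates are placeholders): each <url> entry gives the page’s location and, optionally, when it last changed.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

In practice, most SEO plugins generate and update this file for you; you rarely need to write it by hand.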


How to submit your sitemap to Google

First, you need a sitemap. If you’re using WordPress, we highly recommend (and use) SEOPress. Once installed, be sure to enable the XML sitemap.

Next, you will want to have your website registered on Google Search Console.

Once you’re all set up, simply follow the instructions for creating and submitting your sitemap as outlined by Google.
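Before submitting, it’s worth a quick sanity check that your sitemap is well-formed and lists the URLs you expect. Here is a minimal sketch in Python using only the standard library (the sample sitemap below is made up for illustration):

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol, in ElementTree's notation.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Parse a sitemap document and return the listed <loc> URLs."""
    root = ET.fromstring(xml_text)  # raises ParseError if the XML is malformed
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# A hypothetical two-page sitemap.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/services/</loc></url>
</urlset>"""

print(sitemap_urls(sample))
# → ['https://www.example.com/', 'https://www.example.com/services/']
```

If the parse fails or a URL you care about is missing from the list, fix the sitemap before you submit it.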

Expand your crawl budget

Once you’ve successfully submitted your sitemap, Google should soon crawl your website. But not all crawling is created equal. 

Google assigns different amounts of crawl budget depending on how important your website is. If you have too many redirects and dead ends, the crawl may end before Google gets to your important pages. 

While the crawl budget isn’t a ranking factor in itself, it does influence how frequently your site’s pages are crawled. 

To make the most of your crawl budget:

  • Remove resources from your sitemap that don’t need to be crawled or indexed. For WordPress specifically, this could include specific custom post types or taxonomies (how common posts are grouped together), author archive pages, etc. If you’re using an SEO plugin, be sure to review the settings for the XML Sitemap.
  • Ignore resources that are low priority by editing your robots.txt file to disallow individual resources. 

For example, adding the following to the robots.txt file tells bots not to access the filename.jpg file within the images folder:

User-agent: *
Disallow: /images/filename.jpg

You can also do this for certain file types. In this example, we’re instructing bots that support wildcard matching (such as Googlebot) to ignore all PDF documents:

User-agent: *
Disallow: /*.pdf$
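You can double-check a literal Disallow rule like the first example with Python’s built-in robots.txt parser. This is only a sketch: the standard-library parser follows the original robots.txt conventions and does not understand wildcard rules like /*.pdf$, which are an extension honored by Googlebot.

```python
from urllib import robotparser

# Rules mirroring the first example above. NOTE: urllib.robotparser does
# not support wildcard patterns, so only the literal-path rule is tested.
rules = """\
User-agent: *
Disallow: /images/filename.jpg
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://www.example.com/images/filename.jpg"))  # False
print(rp.can_fetch("*", "https://www.example.com/images/other.jpg"))     # True
```

This kind of check is handy before deploying a robots.txt change, since a typo in a Disallow line can accidentally block pages you want crawled.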

Check your pages are indexed

As the web increases in size and becomes more complex, it’s harder than ever to ensure your website is properly indexed. A 30-day study by Botify, which looked at 413 million unique web pages and 6.2 billion Googlebot requests, showed that: 

  • 51% of the pages weren’t being crawled by Googlebot
  • 37% weren’t indexable
  • 77% weren’t getting any organic search traffic

If you’ve submitted your sitemap, you’re probably already being indexed by Google to a certain extent. But there may still be room for improvement. 

To check, first carry out a Google search for your domain using the site: operator (for example, site:yourwebsite.com) and look for your pages among the results. If you see them there, that’s a great start!

But this will only give you very basic information. For more in-depth insights on how much of your website is indexed (ideally all the pages you have), go to Google Search Console. On the Coverage report tab, you can get an exact number of the pages that are indexed as well as pages with errors or that are not being indexed at all. 

Pay close attention to pages with “Errors” or those that are “Valid with warnings” and take action to fix them.

If you would like to check if a specific page is indexed, use the URL Inspection Tool.
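One common reason a specific page stays out of the index is a leftover noindex meta tag. As a rough sketch, you can scan a page’s HTML for one with Python’s standard library (the snippets fed to it below are made-up examples):

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Detects a <meta name="robots" content="...noindex..."> tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)  # HTMLParser lowercases attribute names
        if attrs.get("name", "").lower() == "robots" and \
           "noindex" in (attrs.get("content") or "").lower():
            self.noindex = True

def has_noindex(html: str) -> bool:
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

print(has_noindex('<meta name="robots" content="noindex, nofollow">'))  # True
print(has_noindex('<meta name="robots" content="index, follow">'))      # False
```

If a page you want indexed comes back True, remove the tag (or the plugin setting generating it) before requesting indexing again.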

Index unlisted pages

To index any pages that aren’t currently listed:

  1. Go to Google Search Console.
  2. Click on the URL inspection tool (see above).
  3. Paste the URL in question into the search bar.
  4. Once it pops up, click “Request indexing.”

If you run into an issue or need further information on indexing, check out the posts on indexing on Google’s blog. To make extra sure you’re being indexed properly, the Screaming Frog crawler is great at finding issues that don’t show up in Google’s Coverage report.

What about rendering?

While the difference between crawling and indexing is cut and dried, rendering and indexing are more closely related. The good news is that if your pages are indexed, they’re more than likely being rendered. However, if you want to be sure, use the Google Search Console URL Inspection Tool and click “view crawled page.” You’ll see the HTML that has been rendered.

If you find there’s an issue with rendering, we’d recommend consulting an expert because rendering is one of the most technical aspects of SEO.

In the meantime, to find out more about the subject, we’d recommend watching the webinar on rendering by Google’s Martin Splitt, which covers Google’s own findings on rendering and indexing.

Finally, bear in mind that the main problems with indexing and rendering happen with particularly big sites that have a large number of pages. For most small-to-medium businesses, following the simple best practices for crawling and indexing should empower you to make your website visible within Google search results. 

Once your website is being successfully crawled, rendered, and indexed, it’s time to start thinking about the content you have on your website. Check out our next post in the series for more information about reducing low-value and duplicate content.

If you would like to take control of your technical SEO without the expense of hiring an expert, our DIY SEO Management Software puts you in the driver’s seat. Any questions, feel free to connect with us!

About The Author
Justin Korn

Justin is the founder of Watchdog Studio, and former Director of IT at both Wells Fargo Securities and AirTreks. A prodigy of the dotcom era, he now provides businesses in Oakland, California and the surrounding Bay Area with honest, expert website services to drive growth.