Getting your Shopify site on Google

By Ilana Davis

Let's continue with the topic of indexing, since this seems to be a hot issue right now.

We all want to appear on Google so that customers find our website and buy from us. But it's not so easy to appear in search results. Google's process is much more complex than "if you build it, they will come." So what can we do to help Google along?

Despite what many may think, structured data has no impact on Google indexing your Shopify site.

Structured data helps Google to better understand the content of the page which can speed up their analysis. After installing JSON-LD for SEO, some stores see more pages indexed. This is most likely because Google can understand the page better. It doesn't change the mechanics of how and when Google decides which pages to index via their crawling.
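To make "structured data" concrete: it's machine-readable information embedded in a page, usually as JSON-LD inside a script tag. Here's a minimal, purely illustrative sketch of Product markup (the product name, price, and fields are made up; an app like JSON-LD for SEO generates this for you with your real product data):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A sample product used only for illustration.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

This is the kind of data that tells Google unambiguously "this page is a product, it costs $19.99, and it's in stock" without Google having to infer that from your page layout.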

No one can guarantee that Google will index any or all of your pages. Even if you follow every guideline or trick in the book, there's no guarantee. Any app or service provider that promises results is lying to you.

There are three (overly simplified) phases to getting your site in front of your customers using organic search.

  1. Crawling
  2. Indexing
  3. Search Results


Crawling

Think of crawling as the discovery phase. Google is out there scouring the internet with its bots, looking for new pages or websites.

When they crawl your pages, how many pages they crawl, and which pages they crawl are all up to their algorithm. We cannot control when or how often they crawl your site. And though you can slow Google down from crawling your site, you cannot make it go any faster.


Sitemaps

Your sitemap is like a treasure map telling Google where they can find all your pages. This helps Google to better understand the structure of your website. Shopify automatically creates the sitemap for every store; no other app is needed.

Using Shopify's guide, you can easily submit your sitemap to Google via Google Search Console. Again, no app is required! Shopify support is there to help if you need it.
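If you're curious what a sitemap actually looks like, a Shopify store's sitemap lives at /sitemap.xml and is a sitemap index pointing to separate sub-sitemaps for products, collections, pages, and blogs. A simplified sketch (your-store.com and the exact file names are placeholders; the real file includes more entries):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://your-store.com/sitemap_products_1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://your-store.com/sitemap_pages_1.xml</loc>
  </sitemap>
</sitemapindex>
```

You never have to edit this file yourself; Shopify keeps it up to date as you publish and unpublish pages.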

Robots.txt file

Your robots.txt file asks Google to ignore certain pages. You can find yours by appending /robots.txt to your website URL. Once there, you'll see Disallow: /admin along with many other pages we don't want Google to show in search results.

[Image: First few lines of the robots.txt file from a Shopify store, showing the User-agent line and some Disallow rules.]

The pages in your robots.txt file are there for a reason. Although the checkout page is very important to you in making money, it doesn't provide any value to search queries. By excluding these pages from search results, Google's crawlers can focus on the pages that do matter.

Shopify manages your robots.txt file unless you've modified it by creating a custom robots.txt file.
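If you'd like to check for yourself how crawlers read these rules, Python's standard library ships a robots.txt parser. This sketch feeds it a few lines in the style of a Shopify robots.txt (the real file contains many more rules) and asks whether a crawler may fetch a given path:

```python
from urllib.robotparser import RobotFileParser

# A few illustrative lines in the style of a Shopify robots.txt
# (the real file contains many more rules).
sample_robots = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
"""

parser = RobotFileParser()
parser.parse(sample_robots.splitlines())

# Googlebot falls under the wildcard "*" rules here.
print(parser.can_fetch("Googlebot", "https://example.com/admin"))         # False
print(parser.can_fetch("Googlebot", "https://example.com/products/tee"))  # True
```

Point it at your own store's live /robots.txt (via `set_url` and `read`) and you can confirm exactly which pages Shopify asks crawlers to skip.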

Discovering pages

Just because Google crawls one of your pages doesn't mean it's crawled your entire site. Most of the time, Google won't crawl your entire site at once.

Let's say you have a blog post that links to another page. The blog post is crawled and indexed but the page is not. The page may show in Search Console as Discovered - currently not indexed. Google did discover it, but they haven't done anything with it YET. In this case, Google may reschedule the crawl of that page to avoid overloading your site with bots.

[Image: Google Search Console Page Indexing notice for Discovered - currently not indexed.]

It's common for pages to come and go in your Google Search Console Indexing Report. So don't be surprised if new and old pages show up as not currently indexed.

Google will recrawl your site when the time is right.


Indexing

Now that Google has crawled many of your pages, it's time for them to understand what your pages are about. This phase is all about analyzing the content, including your meta tags, alt text, images, and more.

Think of indexing as Google deciding whether to include the page as an option in search results. Just because the page was crawled, doesn't mean Google will index it.

Using the URL Inspection Tool in Google Search Console, you can request they index individual pages. Google says this can take up to a week or two, but it's usually quick.

[Image: URL Inspection tool in Google Search Console showing Request Indexing for a page that is not indexed.]

You cannot bulk-request indexing for your entire website. That's the job of your sitemap, which Shopify will continue to update as you add new pages to your site.

If Google doesn't index the page, you'll need to figure out why. In some cases, it's ok for Google not to show the page in search results. For example, Shopify sites may have variant URLs crawled as well as the canonical product URL. We don't want to show both the variant URL and the product URL (duplicate content and whatnot). So Google tells you they've decided not to index the variant URL.

Google is doing you a favor by preventing your variant URL from showing up in search results. If both your variant URL and product URL appeared in the results, they'd be fighting each other for ranking. You wouldn't want to compete against yourself.
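The way Google knows the variant URL and the product URL are the same page is the canonical tag that Shopify themes include in the page's HTML. An illustrative sketch (the store domain, product handle, and variant ID are made up):

```html
<!-- On a variant URL such as
     https://your-store.com/products/tee?variant=12345,
     the canonical tag still points at the main product URL: -->
<link rel="canonical" href="https://your-store.com/products/tee">
```

When Google sees this tag, it treats the variant URL as an alternate of the canonical product page rather than as a competing duplicate.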

As much as I prefer to use Google Search Console, it often causes a lot of stress over indexing issues when there isn't actually an issue. The trick with Search Console is that you have to read the details. Most indexing issues can be ignored.

If you're a new store (let's say less than a year old), indexing can take time. You cannot speed up the process nor can you influence Google in any way to index more pages.

If a page should be indexed but isn't, review Google's tips for troubleshooting missing pages.

Search Results

Finally, we get to what your customers see in search results. When customers enter a query, Google has to decide which pages best answer the question.

At this phase, Google is looking for relevant, high-quality content. There are hundreds of factors that go into which results are shown and where they fall in search rankings.

Google evaluates each query to determine the most appropriate type of result to show based on their algorithm. Essentially, they are trying to determine the searcher's intent based on the query. If the query is "best recipe for poutine" then the results will most likely show recipes and videos rather than a local restaurant. Similarly, a query for "battery backups" will likely result in products, shopping ads, and blog posts about the topic.

Google then uses your technical SEO bits to understand the page. This could include your structured data, meta tags, and HTML structure. It's here that Google decides whether those technical bits will influence how the result appears in search results.


Get more organic search traffic from Google without having to fight for better rankings by utilizing search enhancements called Rich Results.

Linking Llama

Link discontinued products to their best substitute. Keep discontinued products published on your website and continue to benefit from traffic to these pages.