Shopify SEO warning: your JavaScript might be costing you sales

Published: December 09, 2025

By Ilana Davis

A customer recently asked me about using JavaScript heavily on their store.

We're considering dynamic rendering as a way to improve our site's SEO. How feasible is this for a Shopify store like ours that heavily depends on JavaScript for user experience, and what impact might this have on our existing JSON-LD structured data?

Before we get into how JavaScript impacts structured data, we need to talk about how JavaScript impacts your store's website as a whole.

The short answer: every bit of JavaScript your store uses can harm sales, performance, and accessibility.

Rule of thumb: you want to use as little JavaScript as possible.

On that bombshell, let's get into the details.

JavaScript is slow code

In case you've never had it explained, JavaScript is a programming language that lets code run in the visitor's browser.

Nearly everything else on a website is just a file that the server sends to the visitor. HTML, CSS, fonts, images, and so on are files your browser is extremely fast at downloading and displaying.

JavaScript, though, has to be processed and evaluated after it's downloaded. That might only take a few seconds, but when computers measure their work in thousandths of a second, even one second is an eternity. Most of the slowness and sluggish feel on Shopify stores comes from the amount of JavaScript they're running.

(Yes, you in the back, you're correct that CSS and HTML are also processed, but they parse and paint so fast that they don't matter.)

JavaScript should be optional

Due to JavaScript's slow speed, flaky compatibility, and bugs, most websites are developed with the idea that JavaScript is optional. That means building features without JavaScript and only using JavaScript to enhance those features.

For example, start by making the add to cart button send the customer to the cart page with their added product showing in the cart. Then layer on top of that an optional JavaScript feature to have a mini-cart appear on the product page showing the same thing.
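As a sketch of that layering, the baseline can be a plain HTML form posting to Shopify's standard `/cart/add` endpoint, with the mini-cart behavior added on top only when JavaScript is available (the mini-cart itself is left as a stub here, since its markup varies by theme):

```html
<!-- Works with zero JavaScript: submitting navigates to the cart page -->
<form method="post" action="/cart/add">
  <input type="hidden" name="id"
         value="{{ product.selected_or_first_available_variant.id }}">
  <button type="submit">Add to cart</button>
</form>

<script>
  // Optional enhancement: if this script fails to load or errors out,
  // the form above still works entirely on its own.
  document.querySelector('form[action="/cart/add"]')
    .addEventListener('submit', function (event) {
      event.preventDefault();
      fetch('/cart/add.js', { method: 'POST', body: new FormData(this) })
        .then(function () { /* open the theme's mini-cart here */ });
    });
</script>
```

If the `fetch` call or the mini-cart code breaks, the customer falls back to the ordinary form submission and the order still goes through.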

This concept, called progressive enhancement, was also great when errors occurred. If that JavaScript code had an error, or a blind customer was using a screen reader, or the Internet was "having an off-day at AWS", the regular add to cart would still work without JavaScript. The order would still go through; no big outage.

More recent Shopify themes, apps, and even Shopify themselves have mostly abandoned this process in favor of JavaScript-all-the-time. That's pretty much the root cause of so many errors, crashes, and additional costs.

Requiring JavaScript

Things get even worse when JavaScript does client-side rendering. This is where JavaScript code creates every part of your page for you: your content, headers, links, and so on. While this sounds fine, it means your customer has to wait for all of that code to run before they see anything on the page. If a bug sneaks in or something doesn't work correctly, your entire site may be a blank page.
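For illustration, a fully client-side rendered page often ships little more than this shell (`app.js` is a stand-in name). If that one script fails, the customer is left staring at an empty page:

```html
<!-- The entire "page" the server sends; everything else is built by JavaScript -->
<html>
  <body>
    <div id="app"></div>
    <script src="app.js"></script>
  </body>
</html>
```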

To add insult to injury, JavaScript isn't a single thing. There are multiple versions of browsers visiting your store, many running different JavaScript engines (and multiple versions of those). What works in the latest Google Chrome might not work in the latest Safari or Firefox. What runs fine in desktop browsers might crash mobile browsers. And even if you get all the regular browsers working, you might be locking out the bots used by Google and OpenAI, or even humans using accessibility tools.

This is why there are teams of developers who have to maintain JavaScript code. It's morphed into an entire software development process.

Locking out bots

Let's dig into the problems a JavaScript-heavy page poses for bots.

For the majority of the life of the Internet, the web-crawling bots used by Google couldn't even see JavaScript. That meant anything you added via JavaScript was invisible to Google. (And other search engines were even further behind.)

In recent years, Google's bot has been able to see and process JavaScript, but only within limits. It caps how long it will wait for code to run, it supports only a subset of JavaScript, and it won't interact with the page at all.

That means if you use a lot of JavaScript, or require the user to click or scroll, odds are Google's bot won't see any of that content.

The net result is that your page either won't appear in Google or will be missing information.

With the rise of LLMs and AIs crawling pages, that also means AIs won't know either.

If that missing content happened to include your structured data or something else vital, you're now missing out on whatever benefits it would have earned you.

Dynamic rendering

Dynamic rendering was a workaround that was supposed to help bots handle JavaScript.

You'd still have the JavaScript (and all its problems), but you'd also run a whole new set of tools just for search engine bots, which would generate regular HTML pages for them.

You can think of it as reaching the same result (or close to it) as a non-JavaScript page, but via multiple extra steps. The big problem was that the output was never identical to the regular page, which caused even more problems. If you remember having to maintain a regular Shopify theme and a separate mobile Shopify theme (and maybe an AMP Shopify app), you'll be familiar with how much trouble that can be.

Even Google themselves don't recommend Dynamic Rendering at all. In the big scary red box they say:

Dynamic rendering was a workaround and not a long-term solution for problems with JavaScript-generated content in search engines. Instead, we recommend that you use server-side rendering, static rendering, or hydration as a solution.

JavaScript hurts site performance

Let's say you have an amazing team and none of those are problems for you. They created code that handles all errors, works in every browser, and never locks out bots.

You're good right?

Nope.

JavaScript is the biggest detriment to site performance. Sometimes people make a mistake like using an image 10x the size needed or loading 100 images on a page. But beyond those accidents, JavaScript is the cause of the vast majority of performance problems.

JavaScript files might not seem like a problem because of tools that shrink their size. Yet due to the number of files needed and how those files have to be loaded, performance still takes a hit.

When an image loads on a page, the browser downloads it and then creates an area on the page to draw it in. This is called painting, and it's what LCP (Largest Contentful Paint) measures. Once the image is painted, the browser is done with it.

When a JavaScript file is loaded, it has to download the file, parse the file, compile the file, pause the rest of the page processing, and execute the file's contents. Depending on the code, it can be executing for a looooong time too. Now consider if you have a couple dozen JavaScript files that need to do that.
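One small mitigation worth knowing (the filename here is made up for illustration): the `defer` attribute tells the browser not to pause HTML parsing while a script downloads, and to run it only after the page is parsed.

```html
<!-- Blocking: HTML parsing pauses while this downloads, compiles, and runs -->
<script src="storefront.js"></script>

<!-- Deferred: downloads in parallel, runs only after the HTML is parsed -->
<script src="storefront.js" defer></script>
```

Deferring doesn't make the JavaScript itself cheaper to execute; it just keeps it from standing between your customer and the page content.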

My own website, a very lightweight Shopify store, runs 31 different JavaScript files.

Optimizing a large image is easy (compression, resizing, reformatting).

Optimizing the same amount of JavaScript can be a multi-month task for a full team, and it still might not accomplish anything.

For comparison, it might cost $20 for a designer to hand optimize a large image to get it to load faster. You could easily spend $10,000+ for a JavaScript developer to tune some JavaScript.

JavaScript and its impact on structured data

As we've shared above, JavaScript is usually the primary cause of poor page performance. I've also written previously about large page sizes and how they can impact structured data, so I won't get into that too much here.

One question that doesn't get asked enough: is using JavaScript to inject structured data into the page good for SEO?

Yes, but it has limitations and could be harmful for your SEO.

The issue, as Google puts it, is that if your product data changes frequently, JavaScript is not as reliable.

Using Product markup? Be aware that dynamically-generated [JavaScript generated] markup can make Shopping crawls less frequent and less reliable, which can be an issue for fast-changing content like product availability and price. If you're a merchant optimizing for all types of shopping results, make sure your server has enough computing resources to handle increased traffic from Google.

As we've talked about above, Google can see JavaScript-generated data, so that's good. However, we don't know whether the app generating structured data via JavaScript has enough time and processing power to create it before Google's bot gives up and moves on to the next page. It's the same issue as a gigantic page that doesn't finish loading for Google's crawlers.

That's not a reliable method of ensuring Google can see your structured data if it takes too long for the content to load.

That's why JSON-LD for SEO doesn't use any JavaScript (with the exception of Magical Review integrations, which don't apply to most stores).

While JSON-LD for SEO uses JSON-LD which looks and sounds like JavaScript, it is something completely different. As we learned in Finding Nemo, a boat and a butt sound the same, but we both know they are completely different.
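For illustration, server-rendered JSON-LD is just a block of static text inside the page's HTML; the browser and Googlebot read it as data and never execute it (the product values below are invented):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Despite the `<script>` tag, the `application/ld+json` type marks it as inert data, which is why it carries none of the processing cost described above.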

Treat JavaScript like medicine with side effects

While this has been about the problems with JavaScript, there are benefits and uses. Some things can only be done in JavaScript or are easier in JavaScript.

The problem is that everyone these days reaches for it to solve problems it has no business solving. It might look cool but that coolness ends up hurting your website (and thus your ecommerce sales).

Think of JavaScript as a medicine with side effects. You only want to use it when something is sick, and when you do, make sure to counter the side effects and stop taking it once you're healthy.

In other words, you can't avoid JavaScript entirely, so use it with caution and in moderation.

That's the real path to getting better Page Speed scores and faster stores.

JSON-LD for SEO

Get more organic search traffic from Google without having to fight for better rankings by utilizing search enhancements called Rich Results.