Throughout this case study, you will learn the techniques we used to increase transactions on an e-commerce website by over 90%. We did this by building a custom, but replicable, strategy for a long-standing client of two years.
You will soon learn the technical, on-site, and backlink approach that allowed this e-commerce client to grow their traffic by 48% year on year.
This traffic growth saw transactions increase by 93.20%, generating an additional $49k for the year, a 39.45% increase in overall revenue (from $123.6k up to $174.5k).
Client relationships are the cornerstone of The Search Initiative. These relationships are built on SEO expertise, results, and our unwavering energy to deliver great work.
Where other agencies may slip into autopilot once the client’s initial aims have been achieved, we continue to strive and innovate for our clients to keep them ahead of the curve, even after two years.
The Challenge
The client is a niche-specialist in the confectionery industry, offering high-end chocolates to customers in the USA and around the world.
The kind of chocolate you eat until you explode.
They specialize in both wholesale and retail chocolate sales and want to attract professional clients from the food industry as well as “off the street” buyers. Building relationships, authority, and a brand following are very important in this business.
The client approached The Search Initiative two years ago and was looking to increase conversions, develop a solid link building strategy, and have an in-depth, on-site SEO audit to improve their traffic metrics.
The following is a walk-through of the steps you can take as an e-commerce site manager to achieve similar gains to our favorite chocolate client.
Perform a Technical SEO Audit
Crawl Management
One of the more common issues faced by e-commerce sites is crawl management.
Google can end up crawling areas of a site that have no use to the bot or to users, often as a result of faceted navigation, query strings, or sometimes Google’s own temperamental behavior.
Crawling such pages is a waste of Google’s time, as they are generally near-duplicates of an original page and offer it no value. Since Google only spends a finite amount of time on a website before it bounces off, you want to control how that time is spent as much as possible.
This means your valuable pages are more likely to be crawled often, and new changes on the site are more likely to be picked up quickly.
What’s even better is: Google tells you what its algorithm “thinks” of your pages!
How? In Search Console -> Coverage report!
One area especially worth inspecting with the greatest care is Coverage report > Excluded > Crawled - currently not indexed.
When reviewing Search Console, you should be looking for URL patterns.
In the “Crawled - currently not indexed” section for our client’s site, we found many random query-string URLs that Google recognized but wasn’t indexing. In Google’s eyes, these URLs had no value. After manually reviewing them, we discovered that Google was right.
To prevent the search engine from spending more time on these URLs and wasting its crawl budget, the easiest approach is to use robots.txt.
The following directives were included in the robots.txt file:
User-agent: *
Disallow: /rss.php*
Disallow: /*?_bc_fsnf=1*
Disallow: /*&_bc_fsnf=1*
This was enough to take care of it!
Please bear in mind that when you are cleaning the index with robots.txt, one part of the Search Console > Coverage report will start going up: Blocked by robots.txt.
This is normal.
Just make sure to review the URLs reported there every now and again, ensuring that only the pages you meant to block are coming up.
If you suddenly see a big spike, or URLs you did not want blocked, it means either you made a mistake or Googlebot found its way somewhere you did not know about.
Index Management
Index management involves removing pages that contribute no value to the user from Google’s index of your site. Google’s index of a site is the list of all the pages it could return to users in the Google SERPs.
Unlike in crawl management, pages that should not be in the index are not always pages with no value to Google at all.
For example, “tag” pages are useful for internally linking articles or products, and therefore have value in that they can help Google understand the relationship between pages.
At the same time, however, these are not the type of pages you want to see in the SERPs, and by keeping them indexed, you encourage Google to crawl them more regularly.
The client had the site set up in such a way that internal search results and tag pages were also being indexed. These provided no value to a user whatsoever, nor would they effectively contribute to better rankings from the search engine’s perspective.
The most common pages that usually mess up index management include:
- Tag Pages
- Thin Content Pages
- Author Pages
- Archives Pages
- Filters
- Sorting
- Faceted navigation (because it can create infinite permutations of URLs)
The tricky part is that you have to identify all the URL parameters and page types that add no value to the SERPs; once identified, you can noindex these pages, as in the sketch below.
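A minimal sketch of the standard fix is a robots meta tag in the page’s <head> (an X-Robots-Tag HTTP header does the same job). The noindex, follow combination below assumes you still want crawlers to follow the links on these pages:
<!-- Keep this page out of Google's index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">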
As a quick note, it is important to understand that there are cases where index management and crawl management are both under the same umbrella.
For example, Google may be crawling non-value query strings and indexing them at the same time. As a result, this is both an indexation issue and a crawling issue.
Double the fun.
Broken Links
Broken links are a troublesome issue that needs resolving if you want a well-oiled website where authority flows freely between your important pages through PageRank.
It can also result in users missing the opportunity to navigate their way to valued e-commerce pages!
A broken page, or 404 page, is in essence a page that returns an error because the URL doesn’t exist or no longer exists. It’s commonly caused by deleting old pages that still have internal links pointing at them from within your site.
The client had 404 errors in abundance, and many internal links were broken; the result of past site changes made without updating the link structure (or doing proper URL mapping).
To find and resolve these, you need to crawl the website. Any popular crawler like Sitebulb, Ahrefs or Screaming Frog will do the trick. Here’s how you can do it using Sitebulb.
Under Link Explorer > Internal Links > Not Found you can identify where the internal links to these 404 URLs are.
After this, you should go through these URLs one by one and remove the broken links manually.
Where possible, replace the links pointing at non-existent pages with a link to a relevant, working page. This is particularly beneficial when you can swap a link from an old, no-longer-existing product page to a new, functioning one.
You may need to fix hundreds of these broken links using this manual technique. All this effort is to ensure no link equity gets lost between the pages you want to rank, especially the money-making pages.
Yes, it’s mundane.
Yes, it’s necessary.
Yes, in most cases, it’s worth it.
Internal 301 Redirects
In addition to finding broken links, crawler tools are also great at picking up internal redirects. These cause hops through intermediate URLs instead of linking directly to the final page, which is the optimal route.
It looks something like this:
If you follow the Red Arrows:
The link points from the source to a page that is redirected with a 301 HTTP response code, only then landing on the correct page (which returns a 200 OK code).
In short: Not Good!
Now, follow Green Arrow:
The link points from the source directly to the correct page (200 OK). There is no interim “hop” through a redirected page (301).
In short: Good!
Don’t get me wrong, though. One internal redirect is normally not an issue in itself. Sites change, things get moved somewhere else. It’s natural.
It becomes a problem when a site has many internal redirects. This starts to impact the crawl budget of the site, because Google spends less time on core content pages that actually exist (200) and more time trying to navigate the site through a thicket of redirected pages (301).
Similar to solving broken links, you have to run a crawl and go through the links identified manually to replace them with the correct, final page URL.
Page Speed and Content Delivery Optimization
I cannot stress enough how important speed optimization is.
In this day and age, it’s non-negotiable for a site to be responsive to users. A slow site that takes ages to load results, in most cases, in users bouncing off, which is not only a loss of traffic but also a loss of potential sales.
And guess what …? Google can also see this!
And when it does, it can lower your rankings, thinking the reason users bounce is something wrong with your content, or intent, or anything else. Google sees the bounce; it doesn’t “see” why users are bouncing.
With BigCommerce, optimizing for speed is very difficult. The platform is largely closed source (even more so than Shopify!), with very limited options for page speed optimization and few useful plugins available.
This is where a content delivery network (CDN) comes in. In its simplest definition, a CDN is a distributed set of servers that mirrors a website across its network.
The main benefit is quicker connection times for users who are trying to access the site but are geographically far from where your server is physically hosted.
In our case, the go-to CDN is Cloudflare, which will allow you to take advantage of a few benefits.
Benefit #1 – Global Delivery Network
BigCommerce’s platform is similar to Shopify in that it requires you to host your website on their server.
This usually works very quickly for US users; however, the client’s site also saw small pockets of traffic and revenue from Europe and the Middle East.
Moving onto Cloudflare’s CDN gets you faster connection times to your site worldwide.
Benefit #2 – Built-in Optimization Tools
Cloudflare has several internal tools that allow for optimizations that are difficult to achieve on the BigCommerce platform.
The biggest of these is Rocket Loader, a tool that optimizes JavaScript through a lot of magic spells, one of which is deferring JS loading until key on-page assets like text and images have fully loaded.
Results
The result: we cut the average page load time by 34% compared to the same time last year.
Structured Data
When you have a client of two years, unforeseen challenges are to be expected.
It is having the technical expertise to tackle these challenges that is the key.
Another big technical fix on BigCommerce concerned the structure and design of structured data within the platform.
Structured data is a snippet of code designed to give bots like Google additional information about a page. This can include letting Google know that a page is a product, article, or recipe page through the corresponding structured data type.
It’s the reason you occasionally see review stars in the SERPs for certain products, for example.
The client was using a third-party app, Yotpo, to consolidate the reviews for their products.
Yotpo generates its own product structured data.
This meant that the product structured data generated by the BigCommerce platform itself was competing with Yotpo’s, and unfortunately, neither was accepted as valid by Google.
With both structured data snippets active on the site, the rating “stars” ⭐ were prevented from showing up for the product pages in the SERPs.
When dealing with any competing structured data snippets, be ready for some manual hard-coding…
When on BigCommerce, you need to cut out the platform’s default review structured data code which can be found under Products > product-view.html in the theme files.
Also, the default product structured data used by BigCommerce is, out of the box, missing some core properties like the image, URL, and SKU number.
Thanks, BigCommerce… 😒
These optimizations will need to be added manually, again.
Depending on what you’re missing, you’ll have to add the code yourself, but always base it on Google’s documented Product structured data example.
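As a rough sketch, a completed Product snippet might look like the following. The product name, values, and URLs are placeholders for illustration, not the client’s real data:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Dark Chocolate Truffles",
  "image": "https://example.com/images/dark-chocolate-truffles.jpg",
  "url": "https://example.com/dark-chocolate-truffles",
  "sku": "CHOC-001",
  "brand": { "@type": "Brand", "name": "Example Chocolatier" },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "112"
  },
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>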
Breadcrumbs
A more generic structured data fix that was necessary was breadcrumb structured data.
This helps give users a straightforward overview of where the page sits relative to your site structure.
Here’s an example snippet of code:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [{
    "@type": "ListItem",
    "position": 1,
    "name": "Books",
    "item": "https://example.com/books"
  },{
    "@type": "ListItem",
    "position": 2,
    "name": "Science Fiction",
    "item": "https://example.com/books/sciencefiction"
  },{
    "@type": "ListItem",
    "position": 3,
    "name": "Award Winners"
  }]
}
</script>
Have a Killer Content Strategy
There is an old SEO saying: Content is King.
For e-commerce websites, a common mistake is to only focus on content/keywords that would appear for a product.
While this is important, there are also often large topics that relate to e-commerce niches that commonly go untapped.
It is also important to note that people do not link to commercial content like sales or product pages as frequently as they link to informational pieces.
By creating more informational content on the site, you increase the chance of securing a natural backlink.
Create Supporting Content
To accelerate your growth and potential revenue, you should focus your efforts on content that will support your money-making e-commerce pages. These support pages have two functions.
- The main aim is to act as a funnel, drawing in a wider base of potential users by targeting higher-volume keywords. Once a user is on the support page, the internal navigation and linking need to move the user towards the money-making e-commerce pages.
- The second is to attract natural backlinks, since informational content earns links far more readily than product pages, and that link equity can then be distributed internally.
The solution was to plan and write dessert recipes, adding recipe structured data to help with featured snippet rankability instead of just trying to expand product pages with content.
These recipe keywords have larger search volume compared to the product keywords.
People interested in sourcing the chocolate for these recipes can then be funneled to the specific product pages through internal links from the ingredients section.
Users could read through the recipe and, if they saw a product they needed, purchase it from the client there and then.
A big win with this was the site starting to appear in Recipes Rich Results:
This is also a perfect example of building a more linkable site and distributing PageRank.
By internally linking from the recipe pages to the core e-commerce pages, a portion of the link equity passes on to the core money pages.
You can replicate this same technique by creating an informational resource, like a blog page, that ranks well and then link to the relevant products, passing relevance for rankings, link equity, and paving the way for a transaction.
Perform Keyword Research by Discovering Your Content Gap
An additional way of expanding an e-commerce website is by reviewing competitors to see what product lines they are offering that you could offer.
Thankfully, the client was flexible and could operationally adapt, creating new products driven by demand and competitor insight, casting a net over new audiences and therefore generating more revenue and growth.
To do this, begin by picking out your competitors. You can leverage your knowledge of the industry or look at who is ranking in the top spots of the SERPs for valuable keywords. Tools like Ahrefs can also suggest competitors based on the keyword similarity between your site and theirs.
After establishing a broad competitor profile, use Ahrefs’ Content Gap tool, which lets you plug in your competitors and highlights the keywords they are targeting that you are not.
The result for us was spotting several keywords that were not originally targeted.
With those golden nuggets, it was easy to identify new products and recipes to quickly target and expand the site.
Content Cannibalization
One of the common issues with working in a focused niche is the potential of keyword cannibalization. This is where two URLs compete to rank for the same keyword.
The main culprit of this was the overuse of the same keyword for similar but different products.
To solve this, you need to identify which keywords are cannibalizing; this can be done with tools like Ahrefs. After identifying the keywords, a judgment call needs to be made on which page should rank for each keyword.
Usually, you have one page that you intend to rank for a keyword. Over time, however, other pages can accidentally rank because they mention the same keyword.
For example, your “fudge brownie” page might be competing with your “chocolate fudge” page.
When this happens, you need to deoptimize the unintentional page.
Additionally, when you optimize a specific URL around a keyword, you should also build plenty of internal and external links to that page with the relevant keyword anchor text.
This is a sure-fire way to let Google know what page you want to rank and for what keyword.
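As a simple illustration (the URL and anchor below are made up), an internal link carrying keyword anchor text is just a plain HTML link:
<!-- The anchor text "chocolate fudge" reinforces what the target page should rank for -->
<a href="/chocolate-fudge">chocolate fudge</a>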
The client was unaware of this crucial on-page tactic. We used it to their advantage, which is partly why the client saw such a sitewide increase: much of the content was already there, it just needed to be tapped into and de-cannibalized (if that’s even a word).
Over a couple of months, keyword cannibalization began to drop off for affected keywords with one single page ranking for core terms.
You can see this in the graph above, where the multiple colored lines, each representing a different URL, fluctuate against one another while competing to rank for a single keyword.
Some time after the cannibalization was fixed, however, a single URL made its way to the top spot for that keyword, and rankings increased.
Leverage a PPC Strategy
A method often overlooked when doing SEO is using PPC data to get a better idea of the organic search landscape.
Since the client was running their own PPC campaign, they created a full export of the past 3 months of data. This allowed us to review keywords with a known history of high conversion.
- Start by taking the PPC data and filtering for your most profitable keywords using the conversion data.
- Then, take the selected keywords and cross-reference them with organic SERP data to see what is ranking in the top spots. You want to make sure these keywords yield similar product pages in the SERPs rather than informational pages.
- Then cross-reference your PPC keywords with the keywords you are already ranking for. From this, two things can happen:
- If you find keywords you are already ranking for in the organic SERPs, these become your focus keywords to optimize for, due to their higher conversion potential.
- If you find keywords you are not ranking for in the organic SERPs, it’s most likely that you don’t have a product page covering the topic convincingly; in which case, this presents a new opportunity!
Establish E-A-T (Expertise, Authority, and Trust)
Search engines have a way of measuring a website’s authority: they look for signals demonstrating that the site belongs to a trustworthy business and/or an expert within its niche.
In SEO, this is commonly referred to as E-A-T: Expertise, Authority, and Trust. It works much the same way you would expect when judging any content source.
For example, if you were to read a legal paper, written by a legal expert from an established & respected law firm, all E-A-T factors would qualify. How?
A legal paper written by a legal expert ticks the Expertise box. The writer is far more likely to have deep knowledge of the material than the average person.
A legal paper published by an established law firm ticks the Authority box. A long-standing law firm is more likely to be heard and recognized in the niche than a brand-new firm or other sources.
Finally, Trust is established by the source being respected. It’s important not to conflate trust with authority: a very big business can have authority in a niche, but if that business makes lots of mistakes, trust will disappear, even though it remains an authority.
Google tries to determine all of the above much in the same way you and I would. Below are three examples detailing how you can establish E-A-T on your site.
1. First, you create a Google My Business (GMB) profile for your site.
Setting it up is quite easy and normally requires a simple confirmation code sent by phone or email.
This is a great way of letting Google know that the website is representative of a physical establishment which brings a level of credibility.
I even recommend creating a GMB and building citations for affiliate websites as a measure to build trust, but that’s another story.
After verifying the profile, fill out the “info” section of GMB with as many of your business’s details as possible, including:
- Type of business
- Contact details
- Hours of service
- Etc.
Since the website is an e-commerce store, it’s vital to complete the “products” section in GMB by adding the products from the site.
In this particular case, the site listed only 15 to 20 products, far fewer than e-commerce sites with over 100 products, so all of them could be manually added to GMB.
In cases where a website is selling numerous products, instead of listing each of them individually, it is recommended to add the “categories” of products under the “products” section of GMB.
I know, it sounds a little pushy, but you can’t add hundreds of products to GMB (say, individual Nike shoe models), so why not say you are selling Nike shoes? Overall, it’s still a product!
Be sure to add as many images as you can, too! In particular, the brand logo and additional images in the “photos” section of GMB, as they form the base of the GMB setup.
Brand-new listings can lack authority, as they have few reviews. You can address this by reaching out to regular customers and asking them for reviews. This is what the client did, and the reviews on their GMB profile started pouring in.
And very quickly this began happening …
2. Implement Structured Data
One of the limiting factors of BigCommerce (the e-commerce platform used by the client) is the lack of structured data on anything other than the blog.
So if you’re using this (mostly) good platform, be ready for some custom coding!
Adding organizational structured data is crucial in establishing the authority for your website.
If you have a fantastic social media presence, adding structured data will make it crystal clear to Google that your website along with your high profile social accounts are part of the same entity.
Organization structured data markup can be generated and added to the homepage using the theme editor.
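Here’s a minimal sketch of an Organization snippet that ties the site to its social profiles via sameAs; the business name, URLs, and social accounts below are placeholders:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Chocolatier",
  "url": "https://example.com/",
  "logo": "https://example.com/images/logo.png",
  "sameAs": [
    "https://www.facebook.com/examplechocolatier",
    "https://www.instagram.com/examplechocolatier",
    "https://twitter.com/examplechocolatier"
  ]
}
</script>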
3. Build Your Own Authority
The client’s website has an author who is responsible for most of the site’s content and is the lead chef-writer for most of its recipes. Consequently, building E-A-T around the author himself was also a great opportunity.
To help yourself get authority, you should start by implementing the following:
- Internally link back to your author page from all recipe pages (or articles) written by you. You want to pass on the internal link juice. And LINK JUICE = AUTHORITY in the eyes of Google.
- Add the author name to all the recipe structured data you create, as in the sketch below. The same principle applies to article structured data, which has the same section for specifying the author.
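Here’s a minimal sketch of a Recipe snippet with the author property filled in; the recipe, author name, and URL are placeholder values:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Triple Chocolate Brownies",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://example.com/about/jane-doe"
  },
  "recipeIngredient": [
    "200g dark chocolate",
    "100g milk chocolate chips"
  ],
  "recipeInstructions": "Melt the chocolate, fold it into the batter, and bake."
}
</script>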
Search engines equate this sort of information with E-A-T. By building on your existing content supporting who you are, what you do, and where you come from, you can cement a trustworthy online identity to pave the way for the harder SEO work required.
Ensure you are showcasing your expertise in your field by giving your articles an author and internally linking every mention to your dedicated biography.
If you also have a Google My Business (GMB) account, you should add your relevant services and location data there. This is especially important because a business’s citations are cross-referenced against GMB, a vital source for Google’s picture of a website.
Link Building Strategy
A backlink is simply a link from one website to another, e.g. through hypertext. So why is this important, and why should you expand your link profile?
Link Building is one of the three foundations of SEO along with Content and Technical SEO.
Therefore any SEO strategy needs to incorporate backlink building as part of its overall SEO work.
Before you start link building, you have to analyze the link profile of your site.
Anchor Text
To ensure you have a rich profile, you need to aim for a healthy mix of branded and targeted anchor texts.
You should use a variety of the following types of anchors to further strengthen your profile:
- Exact match: the specific target keyword
- Phrase match: expanded variations of the target keyword
- Brand name: anchors that exactly match the client’s brand name
- Branded: variations of brand keywords
- Keyword branded: a mix of the target keyword with the brand
The image above gives a rough template which, through internal research, has proven optimal across an average of several websites in several industries.
To get a more contextual, industry-specific model, it’s better to average several competitors in your industry who are competing at a similar level or better on Google. To do this, plug a list of competitors into Ahrefs and review their Anchors section.
As can be seen from the example above, the competitor has roughly 70% branded anchors, very similar to other competitors. In this case, it makes sense for the client site’s branded anchors to target the same range.
With that said, it’s time to start getting links!
Leverage Your Network
Google sees links as a vote of confidence for the page being linked to. It’s often much easier to connect with people you already know and have a relationship with than with strangers. Link building can work the same way!
In the client’s case, they had established relationships with people within the industry, including food publications, industry suppliers, and industry influencers (e.g. recipe bloggers), who were reached out to for link opportunities.
Building Your Brand Offline
One of the new opportunities the client capitalized on was sponsorship links from high-profile industry events.
The process revolves around sponsoring an event or a large organization (often a nonprofit). These organizations and event hosts tend to have very authoritative websites, and as a thank-you, they tend to recognize their sponsors on their site, which means a potential link opportunity!
In the client’s case, cooking shows, events, and recognized bodies within the industry were researched and reached out to; the same principle, however, can be applied to most industries.
After securing only 4 extremely powerful links from very high-authority sources, the client’s DR shot up from 40 to 51 in the short space of three weeks (additional links have contributed to the continued increase since).
I’ll say this again:
From DR 40 to DR 51 in three (3) weeks!
Outreach
Outreach is the process of reaching out to a fellow website owner/webmaster and asking for a link on their site. The tricky part can be finding the types of websites that are more likely to be cooperative and are happy to link back to you.
In this case, two methods were used.
The logic behind the supporting recipe pages on the client’s site was also applied to link building.
The reasoning: recipe guides were the most common long-form content surrounding chocolate. Consequently, recipe bloggers were identified by reviewing the SERPs for dessert recipe keywords; the top-ranking domains were selected and then reached out to.
You can also take this method further, by expanding your outreach pools to sites that are topically adjacent to your niche.
In the client’s case, this moved from outreaching to recipe blogs, to cooking sites, and finally culture and lifestyle websites that cover food.
Let your competitors do the hard work!
One of the quickest ways of identifying websites that are more likely to link to you is by identifying sites that already link to your competitors.
Tools like Ahrefs’ Link Intersect allow you to quickly identify which domains link to your competitors but not to you.
Fix External Broken Links (404s)
Brilliant Basics!
As products go out of stock or are discontinued, you should routinely carry out broken backlink audits.
This is where you check if any external backlinks are pointing to a 404 page on your site. These backlinks provide no value to your site and are a lost opportunity.
Opportunities like these can be easily identified using Ahrefs’ broken link checker; it’s a quick and easy way to reclaim a free backlink!
Where links to 404 pages are found, it is recommended to implement 301 redirects to the most suitable, live pages.
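BigCommerce includes a built-in 301 redirect manager for exactly this. On setups where you control the server configuration, a minimal sketch of the same fix in an Apache .htaccess file looks like the line below (both URLs are placeholders):
# Send the dead product URL, and the backlinks pointing at it, to the closest live equivalent
Redirect 301 /old-chocolate-box https://example.com/chocolate-gift-box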
Backlink Optimization Results
The techniques mentioned above resulted in a steady but positive uptick in referring domains to the website.
This is roughly the type of link velocity you want. Continuous, stable growth reflects a website that is growing organically, which helps Google see the site as continually relevant, cementing and improving its rankings.
The Results
Christmas and Valentine’s Day are seasonal peaks for the client.
The client took full advantage of this seasonality, increasing their sessions by 50.47%, which generated a 72.31% increase in revenue.
According to Ahrefs data, since the start of the account, the number of keywords ranking in the top 3 has increased by 59%.
Since the start of the account, referring domains have increased by 53%.
Traffic across the whole domain also saw a very positive uptick.
Compared with the same time last year, organic traffic has seen an increase of 48.61% in sessions and an increase of 51.43% in users.
*Based on the client’s internal data, the work TSI carried out improved e-commerce conversion rates by 30%.
Combined with the increase in traffic, the client saw transactions increase by 93.20%, generating a 39.45% increase in overall revenue.
Conclusion
In this case study, you learned how to fix the technical issues common to e-commerce sites and create a content plan for continual growth. You also learned how to establish E-A-T and create a killer backlink plan.
At the start of this case study, I explained how the relationship with our clients is built on SEO expertise, results, and energy to deliver great work.
Exactly two years into this client’s account, we are still approaching the campaign with the same dynamism and enthusiasm as we did on day one.
If you are a business owner or run your own e-commerce site, you must employ all the pillars of a high-end SEO strategy. This will not only improve your search metrics, but it will improve your profits!
If you need some help with this, you know where to find us.
Get a Free Website Consultation from The Search Initiative: