The Philippine ecommerce scene is a vast, untapped oil field of opportunities. The ones who can set up shop and do it the right way today will be the ecommerce tycoons of tomorrow. That’s what we learned firsthand when we took on an ecommerce SEO campaign with a local retailer of cosmetic products. We initially had our doubts about the readiness of the Philippine market for niche ecommerce stores, but we were completely blown away by how things turned out.
In this case study, we’ll discuss how we used fundamental SEO tactics to grow the client’s organic traffic from 4,500 to 85,000 in just over six months.
What is Ecommerce SEO?
Ecommerce SEO is the process of optimizing online stores for greater visibility in search engine results pages (SERPs). Unlike SEO for typical sites, this type of optimization is more focused on a site’s technical health, crawlability, content uniqueness and long-tail keywords. Link building is still an essential part of this sub-type of SEO, but it’s not quite as central as it would be for non-ecommerce sites.
Project Background
At the time we onboarded it, the site was about eight months old and had a decent product lineup. It also ran an online beauty magazine with some good content, but SEO was never really applied to maximize online visibility. This is how its organic traffic looked from October 2014 to the end of May 2015, before we stepped in:
As you can see, growth was pretty much non-existent. This is what the organic traffic looked like after we worked on it for several months (June 2015-February 2016):
Initially, we had doubts about the results and we considered the possibility that the organic traffic might have grown as the months moved closer to December, when online shopping peaks. However, the organic traffic continued growing even after the holidays were over, solidifying our conviction that this couldn’t be a fluke.
Combine that with the fact that the year-on-year organic traffic figures were simply too far apart, and we can safely conclude that seasonality had nothing to do with the traffic surge. This was SEO at work, pure and simple.
The Ecommerce Optimization Process
Over the years, I’ve noticed that even seasoned SEOs can struggle with ecommerce optimization. Whether it’s the size, the complex structures or the lack of in-site linkable assets in ecommerce properties that gives SEOs fits, I can see why people would think of projects like these as big challenges.
At GDI, the cornerstone of our success with ecommerce sites is a simplified optimization process that addresses every facet of Google’s ranking signal buckets.
As seen in the diagram above, ecommerce sites can be effectively optimized using the same core SEO principles that most of us apply to any other website. The facets of technical health, content, authority signals and usage data are all essential to any campaign. The list below shows the SEO techniques we applied, in the order we carried them out.
Please note that we did not apply every tactic we knew because of the client’s limited budget. We chose to stick with the most important optimization tasks so we could give the client the biggest bang for their buck:
I. Technical SEO Audit
Technical SEO is all about making sure that all public-facing pages on your site can be found, crawled and indexed by search engines. In that regard, these are the things we emphasize when doing a technical SEO audit:
- Page availability – A site’s ability to keep its pages up and running.
- Internal linking – Your site’s ability to facilitate bot movement from its home page to its inner pages.
- Page load times – The overall speed at which your pages load fully.
For most technical SEO audits, we rely on free tools that are easy to use. In fact, Google Search Console and the Screaming Frog SEO Spider were all we needed for this project.
Here are the steps we took when we did the technical audit for the ecommerce client:
1. Robots.txt Validation
The robots.txt file is a simple document that tells search engines which pages can be crawled and which ones are off limits. It’s common practice to disallow bot crawls while a site is under development. However, there are cases when the site launches but the devs forget to edit this file to let the bots back in.
Much like checking if the TV is plugged in before you try to watch anything on it, checking whether the robots.txt file is properly configured makes sense before you run all sorts of indexing diagnostics. Make sure that search engine spiders are allowed in while back-end and dynamic URLs are kept off limits.
This is the first thing we checked in this client’s case. Notice how we put an asterisk beside User-agent to denote that we are allowing all bots to access the public-facing pages. Conversely, notice how we disallowed crawls on the pages that users encounter during the checkout process.
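For readers who haven’t set one up before, here’s a minimal sketch of the kind of configuration described above. The disallowed paths and sitemap location are hypothetical placeholders, not the client’s actual file:

```
# Allow all bots to crawl public-facing pages,
# but keep cart and checkout URLs out of their reach.
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/

# Point crawlers to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```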
Keeping your robots.txt file tight not only ensures that non-public-facing pages stay off search engine indices, it also allows you to stay efficient with your crawl budget. You see, Google doesn’t necessarily crawl every page on your site each time it visits. The number of pages that it reads and considers for inclusion in its index is a function of page volume and your site’s overall authority. By being judicious with the number of pages that you allow Google to crawl, you increase the chances that every page you want ranked actually gets visited.
This isn’t a post about crawl budget theories, so I’ll leave it to you to do more research on it. I recommend reading this post by Ian Lurie on Search Engine Land. I had the pleasure of meeting him last September as a co-speaker at Confluence, and he has lots of great insights on the matter.
2. Meta Directives Audit
Your robots.txt file isn’t the only way to restrict page indexing in your site. Meta tags such as noindex and noarchive can also be used to exclude pages from the SERPs for whatever reason. To make sure that all the pages we want indexed are representing us for our target keywords, we performed an audit of every page in the ecommerce site using Screaming Frog.
You can either set the tool to Spider mode and let it crawl the site, or you can set it to List mode and let it analyze a specific set of URLs. For our part, we usually just extract the URLs from the XML sitemap and let Screaming Frog analyze those. Doing a full crawl usually takes more time, and Screaming Frog doesn’t discriminate between public and non-public-facing pages. As long as they’re internally linked from other pages, the tool will find and include them.
Once the tool is done with its analysis, you can export the results to a CSV file. You can then check the Meta Directives column and see if the pages you want indexed have meta directive tags that might be blocking bots.
The screenshot above shows what the report would look like (I used my site here as the example). In this campaign, we made sure that all the product, category and article pages did not have noindex and noarchive tags.
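For reference, these directives sit in a page’s <head>. The snippet below is a generic illustration rather than markup from the client’s site; pages you want indexed should either carry an index directive or omit the meta robots tag entirely, since indexing is the default:

```html
<!-- Keeps a page out of the index and out of the search engine's cached copies -->
<meta name="robots" content="noindex, noarchive">

<!-- Explicitly allows indexing and link following (also the behavior when the tag is absent) -->
<meta name="robots" content="index, follow">
```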
3. Sitemap Cleanup
Google and other search engines use XML sitemaps to better understand your site’s structure and to crawl it more intelligently. By putting a URL in the XML sitemap, you are basically telling Google that it’s an important webpage that should strongly be considered for indexing.
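For context, each page you want considered for indexing gets its own <url> entry in the file. The example below is a generic sketch with placeholder values, not the client’s actual sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/makeup/lipsticks/</loc>
    <lastmod>2016-02-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <!-- one <url> entry per important, indexable page -->
</urlset>
```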
However, having a URL in the sitemap doesn’t always mean that Google will include it in its main index. As you can see in the Search Console screenshot below, the client only had about 62% of the pages in its XML sitemap indexed when we started.
Ideally, a site should have a 100% indexing ratio. There are a few reasons why a page listed in the XML sitemap might be ignored by Google. These include:
- Bot restrictions – Having a URL in the XML sitemap but blocking bots with robots.txt or the noindex meta tag makes no sense. If you’re blocking some pages intentionally and with good reason, take them out of the sitemap.
- Thin content – Google views content as “thin” when it doesn’t have enough unique merit to justify its existence. Doorway pages, thin affiliate pages and scraped content are usually filtered out of the SERPs. Matt Cutts shed some light on what Google considers thin content in this video. If you have content that search engines might consider thin, you can either de-index it and take it off your sitemap or try to beef it up.
- Duplicate content – Internal duplication or close resemblance between two pages can cause Google to filter some of them out. This is particularly tricky among product pages in ecommerce sites, where some SKUs differ in only one aspect such as size or color. In cases like these, the copy is almost identical and may be mistaken by search engines for absolute duplication. In these instances, we usually choose just one among all of the similar product pages to represent the site on the SERPs and point the rest of the pages to it using canonical tags.
4. Crawl Error Fixes
Google Search Console has a handy crawl error report that shows you lists of pages that Google tried to read unsuccessfully. Ecommerce sites are particularly prone to these, and having too many crawl errors can harm your overall search visibility. There are three main types of crawl errors. Here’s a description of each one and advice on how to address them:
- Server Errors – These are reported when a page times out or when the server refuses access to bots. These can be addressed by fixing the technical root causes with the help of a developer.
- Soft 404s – These are reported when a webpage does not exist and your site displays a Not Found message, yet the server response is 200 OK instead of 404 Not Found. These can be fixed by giving the URLs a hard 404 Not Found status, which Google interprets as the final deletion of a page. Alternatively, you can use 301 redirects to point users and bots to pages that have replaced them or are closely related to them in context.
- 404 Not Found – These are reported when a page that was once successfully crawled by Google no longer exists. Action may not be necessary, as Google will “forget” a page after a few months of non-existence. However, if a new page at a different URL has replaced or is very closely related to the page that Google reports as Not Found, it could be a good idea to 301 redirect the old URL to the new one (a configuration sketch follows this list).
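How you return hard 404s or set up the redirects depends on the platform. As a rough sketch, on an Apache server the 301s can be declared in the .htaccess file; the paths below are hypothetical placeholders, not the client’s actual URLs:

```apache
# .htaccess sketch (Apache): permanently redirect a removed product page
# to its closest replacement so users and bots land somewhere useful.
Redirect 301 /products/old-bb-cream /products/new-bb-cream

# URLs with no replacement should simply return a real 404 status,
# which the server does by default once the page is deleted.
```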
In the case of our client, we found a few hundred crawl errors, mostly soft 404s as seen in the image below. We had them convert the server response to legitimate 404 Not Founds for the most part. The remaining soft 404s and real 404 errors were 301 redirected to appropriate target pages.
At the end of the process, we marked the errors as fixed on Search Console. We have since performed monthly checks to make sure the issues are dealt with promptly in cases where they recur.
5. Site Speed Check
Site speed has become a very important technical SEO factor. As Gary Illyes of Google has repeatedly stated at SEO conferences, the search giant is obsessed with page load times. Studies have also shown that good page load times correlate with high degrees of search visibility. On that note, we treat the matter of speed very seriously, and the first step to helping our clients speed up their sites is to show them how their site currently performs in Google’s view.
While a lot of SEOs use the Pingdom page speed test tool, we also like using Google PageSpeed Insights. This tool provides a nice summary of stats that gauge load times. It also gives you recommendations on how to boost performance with tweaks to your site.
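To give a flavor of what those recommendations look like in practice, two of the most common ones are text compression and browser caching. The snippet below is a generic Apache sketch using the standard mod_deflate and mod_expires directives, not the client’s actual server configuration:

```apache
# Compress text-based assets before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets so repeat page loads are faster
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
```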
I have to admit that the site is below average in this facet and it continues to be that way nearly a year after we started SEO. As the search marketing team, GDI can only provide advice on the value of improving site speed and how it can possibly be done. The client and its development team are ultimately in charge of implementing these enhancements. We can only hope that this becomes a priority at some point in the engagement.
II. On-Page SEO
On-page SEO was the most critical part of this project. Giving the site a better structure and enhancing the content on each page allowed us to show Google the relationship between what was on the site’s pages and our target keywords.
We didn’t do anything fancy. The process is very similar to what a lot of SEOs here in the Philippines use. To be perfectly honest, it wasn’t hard work – there was just a lot to be done. Here’s how we did it, step by step:
1. Site Structure Improvement
Establishing a logical site structure that silos your content into interrelated topics is an underrated aspect of on-page SEO. Proper categorization and internal linking not only allow bots to crawl deeper, they also facilitate a better flow of internal link equity to your inner pages.
When the site was handed over to us, one of the first things we noticed was the lack of proper categorization. This store grouped its beauty products under their respective manufacturer brands and the body parts where they can be applied, but it didn’t go beyond those basic categories as seen in the screenshot below:
I’m no beauty expert, but even I can tell that cosmetics and personal care products can be grouped into more specific classes. Facial care products can be grouped into BB creams, moisturizers and toners. Makeup can be grouped into mascaras, foundations and lipsticks. Following that logic, we asked the client to group all the products into categories and subcategories, which allowed us to create pages for each one.
This is a crucial step because ecommerce sites can better target fat head and chunky middle keywords with category and subcategory pages. Long tails are usually better targeted using very specific product pages. At the end of the process, we were able to create new category pages which were featured in the site’s main navigation menu:
Having links to category and subcategory pages in the main menu is critical because it lets bots discover these pages right from the home page. Also, keep in mind that for most sites, the highest concentration of link equity is on the home page. Linking to category pages from there allows link equity to flow straight to them, giving them a much better chance of ranking well for their target keywords.
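In markup terms, the main menu ended up looking roughly like the sketch below. The category names and URLs here are illustrative placeholders rather than the client’s actual navigation:

```html
<!-- Main navigation: category and subcategory pages are linked site-wide,
     so link equity from the home page flows straight to them -->
<nav>
  <ul>
    <li><a href="/makeup/">Makeup</a>
      <ul>
        <li><a href="/makeup/lipsticks/">Lipsticks</a></li>
        <li><a href="/makeup/foundations/">Foundations</a></li>
      </ul>
    </li>
    <li><a href="/facial-care/">Facial Care</a>
      <ul>
        <li><a href="/facial-care/bb-creams/">BB Creams</a></li>
        <li><a href="/facial-care/moisturizers/">Moisturizers</a></li>
      </ul>
    </li>
  </ul>
</nav>
```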
2. Page Copy Optimization
Staying on the subject of category pages, I want to let you in on a little secret: a lot of ecommerce sites fail in SEO because they neglect the optimization of their category pages. A lot of site owners assume that they can just create these pages and use them to drive people to product pages with linked images and text, but that’s not enough in most cases. Here’s an example:
That’s all well and good if your site has massive domain authority that can power its pages to the top of the rankings by that virtue alone. For a site with a domain authority of 22 like our client’s, that’s nowhere near enough to gain prominent search engine placement.
To offset our low domain authority, we banked on using richer page copy to help Google understand what our pages are about. A lot of smart ecommerce sites use this tactic. On Lazada Philippines, for instance, you can see copy near the bottom of category pages that seems to have been placed there for SEO reasons:
For our part, we wanted to be a little more aggressive with this tactic. It’s generally accepted in SEO circles that Google gives more weight to text that’s visible above the fold (immediately seen without scrolling down). Knowing that, we decided to position a short text blurb right before the product selection starts in each category page’s body.
You can also see that we added rich text copy after the product selection ends:
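Stripped down to its skeleton, the category page template looked something like the sketch below; the copy and class names are placeholders, not the client’s actual markup:

```html
<h1>BB Creams</h1>

<!-- Short, keyword-relevant blurb placed above the fold, before the product grid -->
<p class="category-intro">Shop BB creams from your favorite Korean and local brands...</p>

<div class="product-grid">
  <!-- linked product images, names and prices -->
</div>

<!-- Richer supporting copy after the product selection ends -->
<div class="category-description">
  <p>A longer write-up on choosing the right BB cream for your skin type...</p>
</div>
```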
I would credit this technique with about 70% of the success we’ve had with this project. Sure, it took a lot of time to get all the work done. At the end of the day, it resulted in a happy client that went from having very little search visibility to being a premier player in the local online cosmetics market.
3. Writing Title Tags with Intent
The title tag is still one of the most important on-page ranking factors. While most of us know how to edit these on a CMS, a lot of ecommerce sites tend to write them without considering one very important element: searcher intent.
Google has become smart enough to realize that different users may enter the same words in its search box but be looking for different things. A query for the term “night cream” shows ambiguous intent and doesn’t really signify whether the searcher wants information on night creams or is looking for an online store to buy them from.
When we write title tags for ecommerce sites, we make sure to be very clear with the intent that our pages cater to. Our client wants to sell beauty products and the title tag has to reflect that. Check out this example where we rank #1:
So instead of writing something like “Night Cream Philippines – <vendor name>,” we go with something like “Buy Night Cream Philippines – <vendor name>,” where “buy” is the word that signifies intent. Not only is the longer keyword phrase easier to rank for, it also ensures that we receive the kind of traffic that matches our business objectives. This keeps away non-targeted visitors and significantly improves overall bounce rates from organic traffic.
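In the page’s markup, the difference amounts to a single word in the title tag. The vendor name below is a placeholder:

```html
<!-- Ambiguous intent -->
<title>Night Cream Philippines - Vendor Name</title>

<!-- Transactional intent made explicit -->
<title>Buy Night Cream Philippines - Vendor Name</title>
```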
4. Title Tag Audit
Staying on the topic of title tags, you’ll want to make sure that every public-facing page has a unique and accurate one. For this project, we relied on Screaming Frog to scan all the pages listed in the XML sitemap. This is what we considered when we reviewed every title tag on every page:
- Presence of the title tag
- Length
- Accuracy in relation to the page’s content
- Intent
- Uniqueness
Alternatively, you can also use Google Search Console to see which pages have deficient titles. This report is under Search Appearance > HTML Improvements.
You can download the reports as CSV for easier data management. This report also contains information on deficient meta descriptions which you should also optimize.
Granted, meta descriptions are not direct ranking factors. However, writing really good ones can increase the click-through rates on your search engine listings and positively impact your overall visibility.
5. Duplicate Content Audit
As mentioned earlier, ecommerce sites are susceptible to cases where Google mistakes similar product pages for identical ones. The HTML Improvements report in Search Console can give you clues about which of your pages are being perceived this way by search engines.
When we worked on this site, we checked how many Duplicate Title Tags were being reported. We then examined each case and manually checked the content of the pages. In cases where we found pages to be very similar in content or even identical, we asked ourselves whether the duplicates had enough merit to justify their existence. If the answer was no, we considered deleting or de-indexing them.
However, most of the exact or near-duplicate pages we found were pages that had to exist. In order to keep them live without harming SEO, we used canonical tags to point bots to the “real” page that we wanted to represent the site on the SERPs. For example, if we sold the same lipstick with the same description in several color shades, we used a canonical tag to tell Google to index and rank just the most popular of the product variants.
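Concretely, each variant page carries a canonical tag in its <head> pointing to the variant chosen to represent the product. The URLs below are hypothetical examples, not the client’s actual pages:

```html
<!-- Placed on /lipsticks/velvet-matte-coral, /lipsticks/velvet-matte-pink and other
     shade variants, pointing search engines to the one variant we want indexed and ranked -->
<link rel="canonical" href="https://www.example.com/lipsticks/velvet-matte-red">
```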
III. Outreach and Link Acquisition
To be honest, we didn’t do a lot of ecommerce link acquisition for this client due to two major reasons:
- Budget – This campaign was running on a monthly retainer of less than P40,000.
- Necessity – We didn’t feel like we needed a big link push to deliver good results.
The strategy was basically to reach out to local beauty/fashion bloggers and ask them for reviews with links back to us. We qualified outreach prospects using the following factors:
- Topical relevance – Partner sites had to be about topics that are closely related to our client’s merchandise.
- Local audience – The content must be targeted towards a Filipino audience.
- Legitimate readership – The readership not only has to be substantial in number, it also has to include a community that comments actively on the content.
- Good social sharing signals – Likes, Tweets and G+1 numbers have to show activity. We give plus points if the blogger’s articles get shared on social media by real people.
We contacted the bloggers and pitched them an arrangement with the following conditions:
- The blogger would publish an article that reviews the shopping experience that they’ve had on our client’s site.
- We gave them cash vouchers usable within the online store so they could “buy” products of their choice at no expense to them.
- The blogger just had to credit us with a link. We prefer dofollow links, of course. However, we respect the policies of our partner sites and we would still appreciate nofollow links.
- We didn’t enforce any word count or keyword focus requirements. We asked the bloggers to write as they would any of their regular posts.
In the end, we got a good number of link placements and positive brand mentions out of the engagements we did. Here are a few samples:
IV. Results
As mentioned earlier, the client has enjoyed a meteoric rise in organic search traffic from the first month of the campaign to the present.
More importantly, that organic traffic rise has translated into conversions. Since we started, the site has generated 121,776 conversions from organic search – far beyond what it was doing prior to optimization. Things actually got so good that the client struggled to keep itself stocked with products during the holiday season.
Learn More SEO at Pepcon 2.0
If you want to learn even more effective and actionable SEO strategies, attend Pepcon 2.0 on May 21st at the AFP Theater in Quezon City, Philippines. Some of the world’s best SEO specialists are speaking: Mike King, Dan Petrovic, Jon Cooper, Jason Acidre and yours truly will share their knowledge to help you get that extra edge in your SEO campaigns.