SEO for New E-commerce Businesses – by Digitalis

“The battle for organic search performance can be won or lost before a site launches. Choices made in the design and implementation stages impact the ability of an ecommerce business to change critical elements for search engine optimization, handcuffing what is typically one of the largest drivers of traffic and revenue”, explains Peter Eigner, Digitalis Managing Director – Europe, further adding:

“E-commerce platforms require a special breed of SEO. It must capitalize on the benefits of scalable architecture, send consistent signals for search engine algorithms, and be easy to optimise”.

The following tips will help an ecommerce site start off strong, or help prepare it for the next redesign or migration.

Taxonomy and Crawlability

E-commerce SEO relies on the strength of the catalog, which provides a hierarchical taxonomy of categories and facets. Each of those represents a page that must fight for organic search rankings. The collective success of those pages determines how much revenue your ecommerce site will drive through organic search.

Organic traffic hinges on the existence of relevant pages to optimize. Ensure that your taxonomy allows for a unique landing page for each critical keyword theme for which you need to rank.

“Ensure that your taxonomy allows for a unique landing page for each critical keyword theme for which you need to rank.”

For example, if you sell trainers and your taxonomy is organized by gender at the highest category level, create an actual page for “women’s trainers” instead of having the click in the navigation open a menu to display subtypes of women’s trainers. Without that page, you cannot target a valuable keyword.

Also, make sure that any product category with traffic-generating keyword themes has its own page instead of being joined with another category. For example, if you need to drive revenue for both shirts and T-shirts, don’t lump them into the same category of “shirts and t-shirts” — or worse, “tops.”

Just as damaging, though, can be multiple exposures of subcategories and facets, as they create duplicate content. If you plan to place the same subcategory under multiple parent categories, be certain that each placement links to the same URL. For example, if “cycling helmets” can be found under the categories of “cycling” and “safety gear,” make sure that both link to the same page.

The same applies to representing the same label as both a category and a facet. This is a common pitfall with gender, such as separate pages for a “women’s trainers” category and a “women’s” facet within the “trainers” category.

Every page that needs to drive organic search revenue needs its own URL. If it has keyword value and is a priority for your business, it needs a unique URL.
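The taxonomy rules above can be sketched as a simple mapping: every navigation path a shopper can take resolves to a single canonical landing-page URL per keyword theme. All category names and URLs below are hypothetical illustrations, not a real site's structure.

```python
# Sketch: many taxonomy paths, one canonical URL per keyword theme.
# All category names and URLs below are hypothetical.

# Each navigation path that can reach "cycling helmets" must resolve
# to the same canonical landing page, not a duplicate.
CANONICAL_URLS = {
    ("cycling", "helmets"): "/cycling-helmets",
    ("safety-gear", "helmets"): "/cycling-helmets",
    # A facet and a category with the same label share one URL too.
    ("trainers", "womens"): "/womens-trainers",
    ("womens", "trainers"): "/womens-trainers",
}

def canonical_url(path):
    """Return the single URL that should rank for this taxonomy path."""
    return CANONICAL_URLS[tuple(path)]

# Both navigation routes land on the same page, so link authority is
# consolidated instead of being split across duplicates.
assert canonical_url(["cycling", "helmets"]) == canonical_url(["safety-gear", "helmets"])
```

However the mapping is stored in practice, the invariant is the same: one keyword theme, one URL, regardless of how many navigation paths lead there.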

Facets should also generate crawlable pages, up to a combination of three facets. Without this step, you won’t be able to rank for product attribute combinations such as “red strapless peep-toes” or “26-inch bicycle wheels.” Keyword research will tell you how many and which attributes are valuable for organic search.

Likewise, the pages that need to drive organic search revenue must have indexable textual elements on them — not just images. Without text fields, there’s nothing to optimize to establish the page relevance that search engines need in order to rank a page. The content should start as plain HTML text and then be progressively enhanced for modern browsers and devices.

Other important tips for ecommerce SEO include:

  • Optimize the system default title tags and meta descriptions to improve the amount and type of information displayed, and to best represent your site’s brand at the end of the title tag.
  • Focus on Google’s mobile-first index. Sites with responsive experiences will fare best in Google’s new index, but optimally implemented mobile sites can still rank well.
  • Use structured product data like price, ratings, and availability to enable rich snippets in search results.
  • Ensure that the site doesn’t generate soft 404 errors. Return a true 404 status code when a page cannot be found.
  • Always implement 301 redirects when changing URL structure, removing content, or making any other major change to a site.
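As a sketch of the structured-data tip above, product details can be emitted as a schema.org Product block in JSON-LD, the format Google supports for rich snippets. The product values and field choices below are invented for illustration; a real implementation should follow the current schema.org and Google documentation.

```python
import json

# Sketch: minimal schema.org Product markup in JSON-LD for rich
# snippets. All product values here are hypothetical examples.
def product_jsonld(name, price, currency, rating, review_count, in_stock):
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": rating,
            "reviewCount": review_count,
        },
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
            "availability": "https://schema.org/InStock" if in_stock
                            else "https://schema.org/OutOfStock",
        },
    }
    # The script tag below is what gets embedded in the page template.
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

print(product_jsonld("Women's Trainers", "59.99", "GBP", 4.5, 128, True))
```

Emitting the block from the product template keeps price, rating, and availability in sync with what the shopper actually sees on the page.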

Content Management Systems

Optimizing content at scale requires a capable content management system. Otherwise, SEO professionals have to rely on developers (see below on how to improve SEO through developers).

“At a minimum, the CMS should enable manual overwrites of defaults for the basic SEO elements: title tags and meta descriptions.”

Also, the CMS should allow the editing of every textual content field on every page — headings, promotions, body copy, descriptions. If it’s text, it should be easily changed, without developer intervention. Pages should not inherit textual elements from their hierarchical parents.

The CMS should also enable the modification of key page-level technical attributes, such as keyword URLs and canonical tags, and the creation of 301 redirects, such as when sunsetting individual older pages.

These same CMS requirements hold true for product information management systems, where descriptions and other product information are stored.

Access to CMSs and PIMs is sometimes gated by internal processes. In large organizations, the SEO team rarely has access to accomplish all of these optimizations. However, enabling the broader business to make these changes relieves some of the pressure on the development staff and frees them to focus on more technical updates.


Lastly, don’t forget to implement tracking. Without it, you won’t be able to respond to questions or make data-driven decisions, such as which areas to optimize first.

Make sure that there’s a web analytics program in place. You don’t necessarily need to implement SEO tags, as traffic from search engine domains is considered to be organic by default if it doesn’t trigger a paid search tag. However, it’s critical that other channels have tags in place so that organic search isn’t overstated.

If the site is on a new domain, new subdomain, or new HTTPS protocol, create and verify a new site in Google Search Console and Bing Webmaster Tools immediately after launch. This is very important. These tools are the only way to receive communication and critical technical and performance data from these major search engines.

How Developers Can Improve SEO

The relationship between web developers and search-engine-optimization teams is sometimes contentious. Seemingly unrelated technical decisions by developers can impact organic search traffic.

“There are enormous SEO benefits in working hand in hand with developers and their release planning and testing cycles, throughout. When SEO experts sit down with developers to discuss opportunities to drive revenue for the site, amazing things can happen,” says Daniel Carnerero, CEO of Ennovators Group and former Darlings of Chelsea Director. “Building a strategy from the ground up with a holistic approach saves time and money in the long run, providing better returns and creating endless opportunities down the line. Most people underestimate the work of SEO strategists and developers early on, paying the price for years. When they operate in silos, the result tends to be far from ideal.”

Easy ways to speed up SEO work

  1. Speed and security. Google values site speed and security. Both are part of the ranking algorithm. And SEO practitioners must rely on development teams to improve these areas. Remember, speed and security are customer experience issues, too. Your development team could already have them on the roadmap.
    1. In the case of site speed, developers likely have people harping on them companywide. Adding your voice to the mix isn’t likely to produce instant results. Instead, ask about their plans to improve the situation.
    2. Also, read up on what it takes to implement these changes — they’re not easy fixes — so that you can discuss it knowledgeably.
  2. Binding JavaScript. Search engines are increasingly adept at crawling JavaScript. Google claims that it can crawl almost anything you throw at it, though it still avoids hash fragments and form fields. Still, if you want to ensure that the search engines can crawl your site, and associate signals of relevance and authority with each page, ask JavaScript developers to bind the destination URL with the anchor text. This creates a link that acts much like plain HTML, sending all of the desired SEO signals.
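The difference can be sketched as the markup each approach produces: a link whose destination URL is bound to the anchor text is crawlable, while a script-only click handler is not. The URLs, anchor text, and the `navigate` handler name below are hypothetical.

```python
# Sketch: two ways a JavaScript front end can render navigation.
# Only the first sends full SEO signals. URLs are hypothetical.

def crawlable_link(url, anchor_text):
    # Destination bound to the anchor text: behaves like plain HTML.
    # A JavaScript click handler can still be attached on top of this
    # without removing the href attribute.
    return '<a href="%s">%s</a>' % (url, anchor_text)

def uncrawlable_link(url, anchor_text):
    # Destination reachable only via script: engines may not follow it,
    # and no anchor-text relevance is associated with the target page.
    return '<span onclick="navigate(\'%s\')">%s</span>' % (url, anchor_text)

print(crawlable_link("/womens-trainers", "Women's Trainers"))
print(uncrawlable_link("/womens-trainers", "Women's Trainers"))
```

The crawlable form costs the front end nothing: the framework can intercept the click for a single-page transition while search engines still see an ordinary HTML link.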
  3. Duplicate content. Search engine bots have a crawl budget: they spend a limited amount of time on your site. If your site is full of duplicate content, the bots will waste time crawling those redundant URLs, the search engines will have a hard time identifying which pages should rank, and new content won’t be discovered as quickly.
    1. There are many ways to resolve duplicate content. Each method comes with risks and benefits. Choosing a solution that is good for SEO and easily implemented by your development team requires discussion.
    2. An added benefit to that discussion is that your developers will likely be able to tell you what’s causing the duplicate content in the first place. You may be able to address the problem at its source.
  4. 301 redirects. Of critical importance when migrating content or redesigning a site, 301 redirects are also necessary for everyday actions. For example, when you change a category name from singular to plural it likely also changes the keyword URL. Unfortunately, 301 redirects are a pain for developers to write and test. Look for an auto-redirect solution for the small instances like this so that every changed URL instantly redirects. You won’t have to remember to request it, developers won’t have to implement it, and it will automatically protect your natural search performance.
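One way to sketch such an auto-redirect rule is a hook that fires whenever a category slug changes and records the old path so it instantly 301s to the new one. The hook name, URL shapes, and in-memory storage below are assumptions for illustration; a real system would persist the map and serve it from the web tier.

```python
# Sketch of an automatic 301 rule: whenever a category slug changes,
# record old -> new so the old URL instantly redirects.
# The CMS hook and URL shapes are hypothetical.

REDIRECTS = {}  # old path -> new path

def on_slug_change(old_slug, new_slug):
    """Called by the CMS whenever a category is renamed."""
    REDIRECTS["/%s" % old_slug] = "/%s" % new_slug
    # Re-point any earlier redirects at the newest URL so redirect
    # chains never exceed one hop.
    for old, target in REDIRECTS.items():
        if target == "/%s" % old_slug:
            REDIRECTS[old] = "/%s" % new_slug

def resolve(path):
    """Return the (status, location) the web server should serve."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

# Renaming the "shirt" category to "shirts" protects the old URL
# automatically -- no one has to remember to request the redirect.
on_slug_change("shirt", "shirts")
assert resolve("/shirt") == (301, "/shirts")
```

The chain-flattening step matters: if “shirt” later becomes “tshirts,” the oldest URL should 301 directly to the newest one rather than hop through intermediate redirects.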
  5. Crawl errors. Errors happen on every site. How many errors are “normal” depends on the type of error and the size of the site. For example, if Amazon had 1,000 404 file not found errors, it would be a drop in the bucket compared with the enormity of that site. But for a small ecommerce site with 100 products, even 20 errors would be a major problem.
    1. Excessive errors drive down search engines’ algorithmic perception of site quality. If your site has a high percentage of errors compared to its size, the chances increase that a searcher would land on an error page from the search results page. That makes both the search engine and your site look bad. Search engines don’t want to take that risk. Thus the more errors on your site, the less likely search engines are to rank it well.
    2. Site quality is an understandable sore spot for developers. They want a clean site, too. Reports of errors can make them defensive. Avoid subjective statements, such as “The site has a lot of errors,” and focus on the specifics. Come to the table with the data from Google Search Console or your web crawler that shows which pages are throwing errors, to quickly move to solutions.
  6. SEO self-sufficiency. Identify ways the SEO team can become self-sufficient, freeing developers from mundane work. Can you edit canonical tags yourself via a bulk upload instead of asking developers to do it? Can you 301 redirect individual pages as needed when promotions expire? Look for small tasks that take developers away from more strategic work. Ask if a solution could be built to enable someone on your team to do the task instead. It will probably result in faster implementation of these tasks since you won’t have to wait for them in a release cycle.
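A bulk canonical upload like the one suggested above could be sketched as a small CSV processor that turns a spreadsheet into page updates. The two column names and the CSV shape are assumptions for illustration, not a real tool's format.

```python
import csv
import io

# Sketch: apply canonical tags in bulk from a two-column CSV of
# (page_url, canonical_url). The column names are assumptions.

def parse_canonical_csv(csv_text):
    """Return a {page_url: canonical_url} map from uploaded CSV text."""
    updates = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        updates[row["page_url"]] = row["canonical_url"]
    return updates

upload = """page_url,canonical_url
/trainers/womens,/womens-trainers
/safety-gear/helmets,/cycling-helmets
"""

updates = parse_canonical_csv(upload)
# Each entry becomes a <link rel="canonical"> on the listed page,
# applied by the CMS without a developer release cycle.
assert updates["/trainers/womens"] == "/womens-trainers"
```

A tool like this moves a recurring, mechanical task from the release queue to the SEO team, which is exactly the kind of self-sufficiency the point above describes.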


A website’s use of the secure protocol — as evidenced by the HTTPS designation and the presence of a security certificate — is a priority for search engines. But recent changes to Google Chrome make it important for web browsers, too.

HTTPS and Browser Behaviour

Google is tackling the security challenge across the web via its Chrome browser. This week, the Chrome team announced that all web pages without an HTTPS certificate will display a “Not Secure” label in the latest Chrome version, scheduled to launch in July 2018. Previously, the “Not Secure” label had been reserved for pages with form fields, such as password logins, shopping cart checkouts, and site-search boxes.

While browser behavior isn’t typically considered natural search marketing, it does impact searchers’ perception of the quality of your company.

Natural search is an excellent way to expose more people to your brand and rekindle interest in those who already know it. The experience they encounter on their visit to your site will create a strong impression.

Seeing a warning in their browser that the site is not secure can signal that it is risky or low quality. It could cause the searcher to leave and choose a competitor.

Moreover, an increase in quick bounces could trigger a rankings decrease, based on the search engines’ presumption that searchers are not finding what they want.

Thus a change in the security labeling for a widely used browser like Chrome could impact natural search performance.

In addition to Google’s heightened Chrome warnings, Safari and Firefox, the other leading browsers, also provide security signals in their address bars.

Rankings Boost

Google has long been vocal in its push for a more secure internet.

Google’s rankings boost associated with HTTPS is based purely on the letters “HTTPS” in the URL, as opposed to the existence of a valid certificate. That means that even insecure pages could receive a rankings boost if they masquerade behind an HTTPS notation. This seems like a pretty weak effort from the search giant, but it reportedly decided to focus on other areas in the algorithm.

Still, in the competitive world of search engine optimization, even a small boost in rankings could give you the lead over a competitor and create additional natural search revenue.

Bing, the world’s second largest search engine across desktop and mobile devices combined, considers the security of the site to be a matter of the owner’s choice, rather than a ranking signal. Bing reportedly does not provide a rankings boost for HTTPS.

Migrating to HTTPS

Migrating to HTTPS isn’t as simple as purchasing a security certificate. Everything, from images to links to 301 redirects, is involved. The process is often complicated.

Much of the complexity involves SEO, as the migration should preserve organic rankings. It should be treated as any other URL change or technical migration.

Search engines consider HTTP and HTTPS as different URLs. Thus HTTP and HTTPS can be indexed simultaneously and create duplicate content that competes for rankings and splits link authority between different versions of the site.
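A simple way to keep the protocol versions from competing is to normalize every URL to a single HTTPS form and redirect or canonicalize everything else to it. The sketch below assumes the non-www host is the canonical one; the domain and paths are hypothetical.

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch: normalize any crawled URL to the single HTTPS version that
# should be indexed. The domain, paths, and the choice of non-www as
# the canonical host are hypothetical assumptions.
def canonical_https(url):
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]  # assume non-www is the canonical host
    # Force the https scheme and drop any fragment.
    return urlunsplit(("https", host, parts.path, parts.query, ""))

# HTTP and www variants collapse into one indexable URL, so link
# authority is not split between duplicate versions of the site.
assert canonical_https("http://www.example.com/trainers") == \
       canonical_https("https://example.com/trainers")
```

Serving a 301 from every non-canonical variant to the output of a rule like this is what prevents HTTP and HTTPS from being indexed side by side.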

As such, make sure to register and verify in Google Search Console and Bing Webmaster Tools the HTTPS versions of the site and all subdomains — including www and non-www. The search engines use these tools to send you messages about your site’s performance and crawlability, which will be vitally important as you migrate.
