A couple of weeks ago, SearchMetrics released its annual Ranking Factors Study. Like past iterations, the 2014 study used the top 30 ranking pages for 10,000 keyword queries on Google to test the correlation between ranking factors and the position of pages within the SERPs. After processing the data from about 300,000 URLs, SearchMetrics produced more evidence that SEO is entering a whole new age.
Highlights of the study include the downtrend of keyword-based content and the rise of a more holistic approach that involves the coverage of entire topics. The data also shows that brands are given premium treatment when it comes to top spots in the SERPs. Links and social signals still correlate very well with SEO success, but the study demonstrates that these are just components of a much bigger whole.
In this post, we’ll discuss the highlights of the study as well as actions that SEOs can take to succeed in the new order.
Correlation Doesn’t Imply Causation
Correlation, of course, does not necessarily imply causation. Correlation is a measure of the strength of a relationship between two variables. For instance, high food intake is correlated with obesity. However, it doesn’t necessarily mean that people who eat a lot are going to become obese. Some people are obese due to rare diseases. Some people can eat a lot and never gain weight due to fast metabolisms.
By the same token, correlations between ranking factors and strong rankings can’t be taken as definitive proof that a factor is causing the favorable placement on search results. It only means that pages performing well in the SERPs usually score well for that particular ranking factor.
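To make the distinction concrete, here’s a minimal sketch of how a correlation coefficient is actually computed. The data is made up for illustration; a high Pearson r only says the two series move together, not that one drives the other.

```python
# Illustrative sketch: two variables can correlate strongly
# without one causing the other. All numbers below are invented.
def pearson(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical: backlink counts vs. a ranking-strength score
backlinks = [120, 95, 80, 60, 40, 20]
position_score = [10, 9, 8, 6, 4, 2]
r = pearson(backlinks, position_score)
print(round(r, 3))  # → 0.987
```

An r near 1 like this is exactly what the study reports for its top factors – and exactly the kind of number that still can’t tell you which way the causal arrow points.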
Now that the math refresher is out of the way, let’s get to the matter at hand…
SEO Ranking Factors 2014 in Summary
If you haven’t had the chance to read through the full study (and I would understand why), let’s do a quick review of what it’s telling us. The infographic below shows the top ranking factors of 2014 using a correlation bar chart enhanced with bubbles for quick reference. These figures were worked out from SearchMetrics’ interpretations of averages. Click on the image for a full view.
The data shows that the top two search results usually feature sites that represent well-established brands in the keyword’s niche. For top results, brand power supersedes simple ranking factors. This is in line with the belief that Google tends to favor brands because it trusts that they’re creating content with user experience – not rankings – as the main consideration.
Since content quality is a subjective topic, SearchMetrics analyzed the content of the webpages in its sample set using Proof and Relevant terms. Proof terms are words that are highly related to the main keywords used in the webpage. Relevant terms are not quite as closely related to the keywords, but they’re likely to appear in a sub-section of the same document.
For instance, a webpage about shoes is very likely to mention Proof terms like sneakers, footwear and boots. Relevant terms may include socks, ankles and running. This suggests that search engines have moved further away from keyword-focused content and now reward pages that use natural variations in terms, sentence constructions and writing styles. It’s all geared towards a more natural reading experience for the user and a better transfer of ideas overall.
This doesn’t necessarily mean that keyword usage should be avoided. As SearchEngineLand points out, “developing keywords to topics to generate holistic content” is the way to go.
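The Proof/Relevant term idea above can be sketched as a simple coverage score. The term lists and weights here are hypothetical placeholders, not SearchMetrics’ actual methodology:

```python
# Rough sketch of Proof/Relevant term scoring for a page about "shoes".
# Term lists and the 2x weight are assumptions for illustration only.
PROOF_TERMS = {"sneakers", "footwear", "boots"}   # closely tied to the topic
RELEVANT_TERMS = {"socks", "ankles", "running"}   # looser topical ties

def topic_coverage(text):
    words = set(text.lower().split())
    proof_hits = len(words & PROOF_TERMS)
    relevant_hits = len(words & RELEVANT_TERMS)
    # Weight proof terms higher, since they signal the core topic.
    return proof_hits * 2 + relevant_hits

page = "our footwear guide compares boots and sneakers for running"
print(topic_coverage(page))  # → 7 (3 proof hits doubled, plus 1 relevant hit)
```

The point of the sketch is that a page covering a topic naturally racks up related vocabulary without repeating the head keyword at all.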
Links are still a major factor in the assessment of a webpage’s relevance to a query. Curiously enough, the correlation between good rankings and the presence of keywords in inbound links has increased but the usage of keyword-infused anchor text has decreased overall. This doesn’t necessarily mean that we should start jumping back on the anchor text link bandwagon. To the contrary, Marcus Tober of SearchMetrics pointed out in his Moz post:
“in fact, the average share of links featuring the keyword in the anchor text has declined from 2013 to 2014 (from ~40% to ~27). But what you see is a falling graph in 2014 which is why the correlation is more positive with regard to better rankings. That means: the better the position of a URL is, the higher the share of backlinks that contain the keyword (on average). On average, this share continuously decreases with each position. In contrast to last year’s curve, this results in the calculation of a high(er) positive correlation.”
Good information architecture is still the most important part of on-site SEO. Logical internal linking schemes that help users navigate a site easily also facilitate better crawls from bots. Easy page discovery, fast load times and a site taxonomy that places a premium on topical relevance are at the heart of on-page excellence.
The study’s authors also point out that diversity in content correlates with better rankings. The use of images and videos to supplement text is evidently a good practice. SearchMetrics also emphasizes reading ease as an element that correlates well with strong rankings.
Social signals are still strongly correlated with good placement in the SERPs. Google seems to still be Google+’s own biggest fan as we see that +1s seemingly have an edge over Facebook likes, Tweets and Pins. Interestingly, Facebook-related signals have a good degree of correlation with good rankings despite the social network not being a known Google ally. Still, correlation doesn’t imply causation and Facebook signals could just be an indication of a page’s overall reach.
Relevance, authority, branding and reach are powerful ranking signals but Google seems to be placing more emphasis on what users actually do when your listings get ranked and clicked on. Click-through rate has one of the strongest correlations with good rankings, as does the time that users spend on a site. As was the case with previous studies, bounce rates for top-ranking sites are well below 50% on average.
Overall, what we saw from the Ranking Factors study isn’t exactly new. If anything, the data just provides more evidence to what some experts have already written about succeeding in the current age of Google. A holistic, natural approach that relies on acting like a publisher and building a legitimate audience is still the best (and safest) way to get in Google’s good graces.
Applying the Lessons
SearchMetrics’ data is nice to know, but it doesn’t mean much if it doesn’t translate into actions that we can apply to our own SEO programs. Here’s a quick list of practices you can start applying quickly to win in Google moving forward:
Make Branding Your Top Priority
Having seen how Google seems willing to downplay regular ranking factors to favor brands in its top search results, we should all come to terms with the fact that search engine dominance requires legitimate branding. As I stated earlier, Google plays favorites with brands because it trusts them to make user experience the main consideration for publishing decisions. It simply follows that we have to build up our brands if we want a stake in premium SERP real estate.
Establishing your brand involves the following elements:
- Thought leadership
- Messaging consistency
- Community focus
- Practicing all of the above in and out of the Web
Luke Summerfield has a nice post over at Entrepreneur.com on how to build a brand that people will gravitate to. I suggest checking it out as well as the references he cites.
Stop Focusing on Keywords
Let’s get one thing straight: the decline of keyword focus as a ranking factor doesn’t mean that we should stop using them. Keywords are essential to the semantic expressions of ideas. You can still have them in your URL slugs, titles, headings and anchor text. Where this can go wrong, however, is with the compulsion of some SEOs to over-optimize. Keyword stuffing is bad and you should avoid it at all times.
In the old school of SEO, the process that yielded a lot of success was simple: target a keyword, repeat it prominently on the page, and build keyword-rich links to it.
That’s no longer the case today as search engines have gone far beyond evaluating keyword relevance based on repetition and link equity alone. You can’t just build a keyword-focused page and link-build your way to victory. As Google gets better at evaluating other ranking factors, we’ll continue to see keyword-laced pages decline in search engine visibility.
Bottom line: focus on natural writing that’s aimed towards the reader and not the bots. I discussed how you can improve reader experience with a post I did recently.
Start Thinking Like a Publisher
This year’s biggest takeaway is content’s sustained rise as the basis of all ranking factors. Length, breadth and substance of content correlate with the good rankings of the pages in the study. On that note, Tober recommends writing based on topics and not keywords.
Instead of building several pages that each focus on one keyword, use your time and energy to build one comprehensive resource page, then surround it with subpages that discuss related topics. Content marketers tend to do a very good job at this as they focus on the sustained creation of multiple assets. Good content marketers shed light on everything there is to know about their subject matter, making for positive user experiences and better search visibility.
Even if you’re not a content marketer, you can still create content with good substance and a holistic scope. A nice example of this would be how Bruce Clay Inc does it. If you check out their main navigation menu, you can see that SEO, PPC, Analytics and other service categories are siloed. Each one has its own resources, tools, services and tutorials subpages for better organization and a higher degree of topical relevance between pages.
This helps search engines understand which section is about which topic. Combined with their branding, tenure and authority, Bruce Clay Inc has enjoyed a prominent position in the SERPs for years and years.
Usage Data is Critical
It’s not exactly press-stopping news, but usage data remains one of the ranking factors most strongly correlated with strength in the SERPs. How your listing performs relative to its position (click-through rate) and how well your content engages the users who click on it (time on site, bounce rate) give Google a better feel for whether you deserve to rank high or not.
Increasing click-through rates is the first step towards getting better usage signals. This is a function of writing better title tags and meta descriptions that sum up the page’s content accurately while matching user intent.
- Title Tags – Focus on delivering a definitive summary in 60 characters or less. Using your main keyword here is good practice but if it gets in the way of context, don’t be afraid to use alternative words. Focus on the central idea and support it with words to drive a brief and snappy point.
- Meta Descriptions – It’s not a direct ranking factor, but the meta description can impact click-through rates negatively if it doesn’t capture the spirit of the page. It’s usually a better idea to custom-fill this field rather than letting your CMS auto-populate it.
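The two rules of thumb above are easy to automate. Here’s a minimal sketch of an audit check (the helper name and page data are hypothetical) that flags over-long titles and descriptions left empty for a CMS to auto-fill:

```python
# Quick sanity check for snippet fields, following the rules of thumb
# above: titles at 60 characters or less, and a hand-written (non-empty)
# meta description. Function name and example data are hypothetical.
def audit_snippet(title, meta_description):
    issues = []
    if len(title) > 60:
        issues.append("title exceeds 60 characters")
    if not meta_description.strip():
        issues.append("meta description is empty (CMS may auto-fill it)")
    return issues

print(audit_snippet("SEO Ranking Factors 2014: What the Data Really Says", ""))
# → ['meta description is empty (CMS may auto-fill it)']
```

Running a check like this across a site’s pages is a cheap way to catch the snippet problems that quietly depress click-through rates.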
Improving average time on site and lowering your bounce rate go hand in hand. Both are functions of usability and content quality. Here are a few quick tips on getting better with these usage signals:
- Make Good on Promises – Title tags and meta descriptions are viewed by users as the tip of your page’s content iceberg. Make sure to set only the right expectations with what you write. Never promise anything that the content doesn’t actually deliver. Users are smart enough to hit the “back” button within seconds if you try to mislead them. In the end, you’ll receive a few cheap visits but those will come at the expense of a bad reputation and possibly weaker rankings.
- Show the Content Above the Fold – Users will expect to find the content you promised as soon as the page loads. Make sure it’s immediately visible and not obscured by graphics, ads or pop-ups. A lot of readers will not have the patience to scroll down or dismiss on-screen objects. Most of them will leave and get the information they want elsewhere.
- Avoid Using Doorway Pages – A doorway page is a landing page that’s optimized for search but doesn’t really contain the content that it promises. It instead gives you a link to the actual content page and possibly a number of other pages. Needless to say, this is bad for user experience and it should be avoided to steer clear of high bounce rates, low session durations and possible Panda 4.0 filter triggers.
- Make Navigation Paths Clear – Having a neat, well-organized and logical navigation scheme facilitates more clicking and enhanced content consumption for users. Exercise good taxonomy in your menus and keep your search box handy to encourage more user activity.
- Always Have a Call to Action – Even if you’re not selling anything or trying to opt people into your mailing list, you should still issue calls to action. Simple invitations to read related articles or to download assets are sufficient calls to action that will lengthen the stay of visitors and reduce bounces.
Build Links Naturally
Even as Google gets better at identifying brands, understanding semantics and factoring social signals into the SERPs, the value of backlinks is still undeniable. Natural links from quality sites still have a strong impact on a listing’s placement in the rankings, as shown by the SearchMetrics study.
Link building in itself isn’t bad but doing a good job at it requires more sophistication now. Scalable tactics such as guest blogging and broken link building are getting more difficult due to saturation from the SEO community. Site owners are increasingly wary of penalties from aggressive link builders, giving birth to an SEO landscape where the best link builders are advocating the use of quality content to generate legitimate links.
Here are some sound practices to keep your site safe as you solicit or attempt to attract links:
- Keywords in anchor text aren’t bad as long as you do it within context and without heavy concentration on a single term.
- Leverage real-life connections for good links. Industry friends will give you the quickest and cleanest links.
- Uniquely useful content attracts the most links. Write about topics in your industry that haven’t been sufficiently discussed by other sites. If most topics have been covered, find fresh angles to present the same topics.
- Press releases yield some of the best links. If your release is truly newsworthy, major news publications can pick it up and link back to you.
- Thought leadership opens up interview opportunities with accompanying links. It also sets you up for guest posting invitations on influential sites.
- Creating assets like tools, whitepapers, reports and templates will attract links from users who’ve been looking for solutions to their problems.
At the end of the day, the correlation between ranking factors and good rankings should serve not as a checklist for what we have to do as SEOs, but as a guide on what Google defines as a quality site. If you read between the lines, you’ll realize that content, technical health, social signals and usage signals are just indicators of something much bigger: the quality of user experience. That brings us full circle to what Google’s Matt Cutts always preaches: build a good site that rewards visitors and Google will do its best to give you better visibility in its rankings.