Google Phantom 2 Update: What It Is and How to Deal with It

If you’re keeping tabs on search engine news at all, there’s no way you would have missed last month’s non-event “Mobilegeddon” – Google’s mobile-friendly algorithm update launch. April 21st came and went, but the drastic shakeup in the SERPs that many anticipated never happened. In Mobilegeddon’s vacuum, however, Google managed to slip in a major algorithm update that went largely unnoticed.

Well, almost unnoticed.

Thanks to Glenn Gabe’s analysis of organic search trends from April 29th onwards, we now know that Google snuck in a game-changing update that significantly impacted organic search visibility for a lot of websites. Google initially declined to comment on the topic beyond stating that neither a Panda nor a Penguin refresh had taken place. Gabe later dubbed the update “Phantom 2” due to its stealthy rollout and Google’s nebulous attitude towards the subject.

The company later confirmed to Search Engine Land that an update did take place and reiterated that it wasn’t the latest hit from filters such as Panda or Penguin. The search giant said it was making changes to how its core algorithm evaluates content quality signals but provided no further details.

So What Is Phantom, Exactly?

The short answer is that nobody (except for Google) knows the update’s exact nature at this point. However, we have a few facts in hand that we can work with to form educated opinions about what’s going on:

  • Phantom is a core algorithm update. It’s not a rolling filter like Panda or Penguin, which are add-ons designed to whack spammy results out of the first few pages of Google’s SERPs.
  • Because it’s a core algorithm tweak, Phantom may be an “always on” part of how Google ranks pages rather than something that runs periodically.
  • This is not the first time Phantom-like ranking fluctuations have happened. The SEO community observed similar changes in early May 2013.
  • Phantom seems to impact rankings on a sitewide basis, as seen in HubPages’ analysis after the site was hit. If the update detects a certain amount of thin or spammy pages, ranking demerits may fall on all of your content regardless of its individual quality.
  • Pages that were struck by Panda in the past seem to carry the most risk.
  • There’s no evidence that mobile-friendliness or link profiles have anything to do with Phantom.
  • Old Panda triggers are still the enemy. Gabe notes that indexed tag pages, clickbait articles, low-quality supplementary content, unoriginal content, ad-heavy pages and stacked videos were rampant on sites negatively impacted by Phantom.
  • “How to” websites are notable examples of Phantom-stricken sites. However, Barry Schwartz of Search Engine Land points out that sites from other industries are also experiencing organic visibility changes.

Given all of that, it’s interesting to see how sites like HubPages, eHow, WikiHow and Answers.com are seeing their organic search visibility and traffic go down at a time when Google is pushing its Knowledge Graph and Direct Answers hard. Coincidence or by design? I’ll leave that question to you.

My blog, for its part, didn’t get a big organic traffic surge, but I did notice significant ranking boosts for some competitive keywords that I want to rank for. Ultimately, I believe this update is a good thing because it forces site owners to look at their content more closely and to invest more heavily in creating unique, expertly written and robust content assets.

Google Ghostbuster: Preventing and Dealing with Phantom

The SEO community is still largely in the dark about this update but what we do know is that it rewards sites that consistently produce good content while punishing ones that harbor low-quality pages. In that regard, the old lessons from dealing with Panda are still likely to be effective.

Making sure only index-worthy pages are accessible to Googlebot, applying professional editorial practices, being ethical with advertising policies and putting user experience ahead of everything else are the hallmarks of a quality website. Here are some areas to look at if you think you’ve been hit by Phantom, or if you want to address potential issues before your site is sanctioned:

Address Pages with Thin Content

Thin content refers to content that carries no unique, inherent value. Though some marketers assume that thin content simply means pages with low word counts, brevity doesn’t necessarily equate to thinness. If a page’s content is mostly fluff or unoriginal, Google can still view it as thin even if the word count is high.

Doorway pages, curated content that doesn’t contribute anything new to a body of knowledge, scraped content and (yes) pages that only lightly discuss a topic can put you at risk. I wrote a full guide on finding and addressing thin content a few months ago which might help you deal with potential issues with content substance.

To avoid accumulating thin content, write material that discusses topics comprehensively. Answer as many likely questions as you can to satisfy a wide range of searcher intents. Whenever possible, get a subject matter expert to write the piece. There’s no substitute for insights that carry years of experience behind each statement. Experts are also less likely to borrow and mash up other people’s content – they have all the ideas they need in their heads.
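
If you want a quick programmatic starting point for an audit, the rough sketch below flags pages whose visible text is unusually short so you can review them by hand. It’s only a heuristic – as noted above, a low word count alone doesn’t make a page thin – and the URLs and threshold are placeholders rather than values from any official tool.

    # Rough first-pass thin-content audit (heuristic only).
    # Assumes the requests and beautifulsoup4 packages are installed;
    # the URL list and word-count threshold are illustrative placeholders.
    import requests
    from bs4 import BeautifulSoup

    THRESHOLD = 300  # words; tune to what "comprehensive" means for your site

    def visible_word_count(url):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        # Strip elements that aren't part of the main content.
        for tag in soup(["script", "style", "nav", "header", "footer"]):
            tag.decompose()
        return len(soup.get_text(separator=" ").split())

    pages = ["https://example.com/page-1", "https://example.com/page-2"]
    for url in pages:
        words = visible_word_count(url)
        if words < THRESHOLD:
            print(f"Review for thinness: {url} ({words} words)")

Anything the script flags still needs a human judgment call – a concise page that fully answers its query is not thin.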

Be Judicious About What You Let Google Index

Not all of your pages can carry rich content. Some pages will inherently be thin because that’s what a good user experience calls for. Login pages don’t have to say anything except where to enter your username and password. Content syndication can be a perfectly legitimate means of providing value to readers. Ecommerce sites can carry lots of variants of the same product, each of which might warrant its own page and description.

All these content types may be considered thin, but you don’t have to get rid of them just because they might end up harming your rankings. If these pages are helping your site deliver the right kind of experience for your users, just make sure not to have Google index them and you should be fine.

Contrary to what some SEOs assume, having more of your pages indexed isn’t necessarily a good thing. By all accounts, Google seems to disfavor sites that clutter up its SERPs with low-quality listings. For best results, have Google index only the pages that are likely to delight searchers. Use your site’s robots.txt file or the “noindex” meta tag to keep pages that Google might deem thin out of its index.
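
As a rough illustration (the paths below are made-up examples, not recommendations for any particular site): a robots.txt rule stops Googlebot from crawling a section at all, while a “noindex” robots meta tag lets a page be crawled but keeps it out of the index. Googlebot has to be able to crawl a page to see its noindex tag, so don’t block that same URL in robots.txt.

    # robots.txt - keep crawlers out of sections that shouldn't surface in search
    # (illustrative paths only)
    User-agent: *
    Disallow: /login/
    Disallow: /tag/

And on an individual page, such as a minor product variant, place this in the <head>:

    <meta name="robots" content="noindex, follow">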

Don’t Be Too Aggressive with Ads

As was the case with Panda, review your site’s policies on ad placement. As much as possible, stay away from pop-ups, ads that blend in with the main content of pages, and layouts where the page body is smothered by ads in the header, sidebars and footer.

Google seems to have no issues with sites that make advertising their primary means of generating income as long as the ads don’t get in the way of user experience. To avoid being hit by a quality update on these grounds, keep ads to one or two sections of your pages. Dedicated banner areas and right sidebars often make sense. You can also convince advertisers to buy placement on your email blasts, videos, whitepapers, newsletters and other content assets that search engines don’t crawl.

Hire Professional Writers

It’s not rocket science, really. If you want good content, you need to hire legitimate writers. Professionals have the skills to craft compelling hooks, write with proper grammar, structure ideas so they flow and issue persuasive calls to action. A site’s content is essentially its soul: if you don’t make it a primary investment, it raises the question of why you’re in the game at all.

Avoid being a penny pincher who hires writers from cheap, fly-by-night sources. Hire writers who impress you and understand the nature of your online business. If you prefer working with agencies, find one that balances cost and output quality.

In a lot of cases, writing the content yourself is a good idea. Just make sure to follow some basic but effective writing best practices to consistently deliver content that delights your audience. Some quick tips include:

  • Writing about topics, not keywords
  • Citing and linking back to sources
  • Using swipe files to produce attractive headlines quickly
  • Avoiding the use of big words
  • Using the active voice whenever possible
  • Breaking up long paragraphs into smaller chunks

Treat User Content with Caution

Content is content. Whether you created it or you allowed members of your audience to post their own material on your pages, Google still views it as your site’s content. You may be creating perfectly good content assets, but if your users are posting spammy and thin pages, your site could still get hit by Phantom or Panda.

When you allow user-generated content to be published on your domain, you need to take responsibility for moderating it. Approving only comments that add value to the discussion of a topic is one good practice. Accepting only guest posts that your audience will find interesting is another.

If your site allows users to create subdomains where they can build their own content hubs, set editorial and usage guidelines that keep people honest. The last thing you want is hordes of pornographic, weapons-related and plagiarized pages getting indexed and counted against you.
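
One practical way to enforce that, sketched below on the assumption that your CMS records a submission’s review status (the field names are hypothetical, not from any particular platform), is to keep user-generated pages out of the index until a human has approved them:

    # Hypothetical moderation gate for user-generated pages. The "status" and
    # "is_spam" fields are illustrative; map them to whatever your CMS stores.
    def robots_meta_for(user_post):
        """Return the content of the robots meta tag for a UGC page."""
        approved = user_post.get("status") == "approved"
        flagged = user_post.get("is_spam", False)
        if approved and not flagged:
            return "index, follow"
        # Unreviewed or flagged submissions stay out of Google's index.
        return "noindex, follow"

    # A freshly submitted guest post stays out of the index until it is approved.
    print(robots_meta_for({"status": "pending"}))   # noindex, follow
    print(robots_meta_for({"status": "approved"}))  # index, follow

Your page template would then render <meta name="robots" content="..."> with whatever this returns, so only vetted user pages compete for rankings under your domain.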

I’ll follow Phantom-related developments closely and update this post as soon as more information is available. For now, focus on keeping quality high and explore the parts of your site that may be at risk.

Glen Dimaandal
Glen Dimaandal is the founder and CEO of SearchWorks.Ph. He has been doing SEO since 2008 and is consistently featured in mainstream media and industry conferences. His core skills include SEO, SEM, data analytics and business development.