Meta Robots Tag Audit Guide

A meta robots tag is a short snippet of HTML in the head section of a webpage that tells search engine robots what they can and can't do with the page. There are many types of meta robots tags, but the most commonly used are the noindex, noarchive, and nofollow directives. If you do search engine optimization, particularly technical SEO, you will already…
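
For reference, here is what these tags look like in practice; a minimal sketch of a page head carrying the directives named above (the title text is a placeholder):

    <head>
      <title>Example page</title>
      <!-- Keep this page out of search results and tell crawlers not to follow its links -->
      <meta name="robots" content="noindex, nofollow">
      <!-- Alternatively: let the page be indexed but block the cached-copy link -->
      <!-- <meta name="robots" content="noarchive"> -->
    </head>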

Site Speed Optimization For Non-Coders

Site speed is an important ranking factor for Google and an often overlooked aspect of search engine optimization. Simply put, the better your site speed, the better your site's chances of climbing the search engine results pages (SERPs). Google's prioritization of site speed makes sense, as faster-loading sites tend to mean a better visitor experience…

How to Score 100 in PageSpeed Insights

Site speed is one of the strongest ranking signals under the technical SEO umbrella. How quickly your pages fully load in the browser is a hallmark of solid performance, which Google likes to reward in the sites it indexes. We recently deployed a pilot version of our new property Mediko.Ph, a medical information resource…
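
As a quick way to track that score outside the browser, here is a minimal sketch in Python against the public PageSpeed Insights v5 API (the target URL is a placeholder; heavy usage requires an API key, and the response structure assumed below is the Lighthouse-based v5 format):

    import json
    import urllib.parse
    import urllib.request

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def psi_score(url: str, strategy: str = "mobile") -> float:
        # Ask PageSpeed Insights to analyze the page for the given strategy
        query = urllib.parse.urlencode({"url": url, "strategy": strategy})
        with urllib.request.urlopen(f"{API}?{query}") as resp:
            data = json.load(resp)
        # Lighthouse reports performance as 0..1; scale to the familiar 0..100
        return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

    if __name__ == "__main__":
        print(psi_score("https://example.com"))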

Google Officially Launches HowTo and FAQ Markups

Good news for bloggers and content creators who like sharing process guides (like me!): Google has announced the official launch of the HowTo and FAQ schema markups. Now, content that shares step-by-step processes and frequently asked questions can be declared using structured data so Google and other search engines can easily identify them…
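
For FAQ content, the markup is declared in JSON-LD, the structured-data format Google recommends; a minimal FAQPage sketch with placeholder question and answer text:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [{
        "@type": "Question",
        "name": "What is a meta robots tag?",
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "A snippet of HTML in the head section that tells crawlers what they may do with the page."
        }
      }]
    }
    </script>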

Yoast SEO 11.1 Adds Video and Image Schema Support

The best just got a little better: Yoast SEO version 11.1 adds support for Schema.org markup for videos and images. With this enhancement, you no longer have to hand-code structured data for your media assets to help search engines understand what they're all about. The latest version of the plugin…
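
To see what the plugin now spares you from writing, here is roughly what hand-coded image markup looks like in JSON-LD (the URL and caption are placeholders, and the exact graph Yoast emits may differ):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "ImageObject",
      "contentUrl": "https://example.com/images/chart.png",
      "caption": "Organic traffic growth after the migration"
    }
    </script>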

XML Sitemaps: Everything You Need to Know

An XML sitemap is a file that helps search engines crawl and index a website's pages more intelligently. Through this protocol, web crawlers can locate more URLs, see where each page sits in the site's information architecture, and gauge its relative importance in the hierarchy.
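
A minimal single-URL sitemap following the sitemaps.org protocol looks like this (the URL, date, and values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2019-05-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>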

Robots.txt: All You Need to Know

Robots.txt is the standard means by which websites tell search engine spiders and other crawlers which pages are open to scanning and which are off-limits. Also known as the robots exclusion standard or robots exclusion protocol, it's used by most websites today and honored by most web crawlers…
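
A minimal robots.txt sketch using the common directives (the paths shown are placeholders, typical of a WordPress install):

    # Applies to every crawler
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    # Point crawlers at the XML sitemap
    Sitemap: https://example.com/sitemap.xml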

How to Boost Organic Traffic by Cleaning Up Crawl Errors

In web development and SEO parlance, a crawl error happens when a search spider fails to access a webpage or an entire website. Technical issues, content management mishaps, or improperly configured bot restrictions can all cause these errors. While most sites have only a handful of crawl errors and are virtually unaffected by them…
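
As a rough spot check outside Search Console, a short script can fetch every URL listed in a sitemap and flag the ones that fail; a minimal sketch in Python (the sitemap URL is a placeholder, and Search Console remains the authoritative crawl-error report):

    import urllib.error
    import urllib.request
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def check_sitemap(sitemap_url: str) -> None:
        # Parse the sitemap and collect every <loc> entry
        with urllib.request.urlopen(sitemap_url) as resp:
            tree = ET.parse(resp)
        for loc in tree.findall(".//sm:loc", NS):
            url = loc.text.strip()
            try:
                status = urllib.request.urlopen(url).status
            except urllib.error.HTTPError as err:
                status = err.code
            # Anything other than 200 deserves a closer look
            if status != 200:
                print(f"{status}  {url}")

    if __name__ == "__main__":
        check_sitemap("https://example.com/sitemap.xml")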

Data Highlighter: A Hidden Gem in Search Console

Adding Schema.org microdata to our webpages helps our sites acquire more organic traffic. Though Google has not declared it a ranking factor, the search giant has acknowledged that structured data helps it better understand a site's HTML elements so it can use them for rich snippets.
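
Data Highlighter itself needs no code at all, but for comparison, hand-written microdata on an article looks roughly like this (the headline, date, and author are placeholders):

    <article itemscope itemtype="https://schema.org/Article">
      <h1 itemprop="headline">Data Highlighter: A Hidden Gem in Search Console</h1>
      <time itemprop="datePublished" datetime="2019-05-01">May 1, 2019</time>
      <span itemprop="author">Author Name</span>
    </article>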