SEO is ever-evolving. RankBrain (Google’s machine learning technology) never sleeps, feeding Google’s Hummingbird algorithm ever-fresh insights for optimization. Information quickly becomes outdated. As a result, I wanted to take some time today to update you on the advancements in search engine optimization that have happened since the first “200 Ranking Factors” blogs were posted and reposted.
In a nutshell: Google is most interested in excellent content, quality links, and relevancy; in other words, in providing a great user experience! If you follow these principles, your website should stand a pretty good chance.
If you’re more interested in the nitty-gritty, I’ll be providing an updated and detailed overview of Google’s known ranking factors below and debunking a few of the most common SEO myths.
1. URL Length
The best practice when it comes to URLs is: “Keep them short and sweet.” Longer URLs may hurt your webpage’s search engine ranking.
2. URL Path
Pages that are located closer to the homepage might see a slight increase in rankings. This could be related to URL length, as pages closer to the homepage usually have shorter URLs than their sub-pages.
3. URL String
URL string parameters are read by Google and help the search engine categorize the page. It is therefore important not to use parameters such as ?=1234 and instead to replace these with content that humans and machines can read.
4. Page URL Contains Hyphens Instead of Underscores
Google sees hyphens as separators, while words separated by underscores are joined together, as explained by Matt Cutts in 2011 in his video “Underscores vs. dashes in URLs.”
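To illustrate the advice above, here is a minimal sketch (in Python, with a hypothetical `slugify` helper that is not part of any library mentioned in this post) of how a readable, hyphen-separated URL slug might be generated from a page title:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, hyphen-separated URL slug.

    Hyphens are used (not underscores) so search engines treat
    each word as a separate token.
    """
    slug = title.lower()
    # Collapse every run of non-alphanumeric characters into one hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Underscores vs. Dashes in URLs"))  # underscores-vs-dashes-in-urls
```

The same function also keeps URLs short and free of unreadable parameters, which ties back to factors 1 and 3 above.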
5. Keyword Location
As we’ve already discussed, keywords are important for search engines to determine relevance. The significance of keyword location goes beyond placement in the title tag and headlines. As a rule of thumb: The higher up the keyword appears on a page, the better. A best practice is therefore to place it within the first 100 words of a page’s content.
6. Keyword in the Title
Having the keyword in the title is an important SEO tactic. It signals that your page is relevant to a particular search. In 2016, Google expanded the character limit for the title tag to 70 characters, which also applies to mobile. Titles exceeding this limit are truncated. However, remember that Google actually measures pixel length, so the effective cutoff can vary. I would therefore recommend keeping the title under 60 characters.
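A quick way to audit your titles against the conservative 60-character guideline mentioned above is a small check like the following (a Python sketch; the `check_title` helper and its messages are my own, not a Google tool):

```python
def check_title(title: str, limit: int = 60) -> str:
    """Warn when a page title risks truncation in search results.

    60 characters is a conservative proxy for Google's pixel-based cutoff.
    """
    if len(title) <= limit:
        return "ok"
    return f"too long by {len(title) - limit} characters"

print(check_title("Short and Sweet | Brand"))  # ok
```

Because the real cutoff is pixel-based, treat any title near the limit as a candidate for shortening.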
7. Start Your Title with a Keyword
Starting your page title with a keyword is an important factor for SEO. If your website title still has the word “Home” or “Homepage” in it, consider starting with a keyword instead and moving your brand name to the end, as follows: | Brand Name
8. Keyword in Description Tag
The description of your page is another relevancy indicator for search engines and hence an important SEO factor. As of December 2017, you are limited to around 320 characters; however, I would recommend sticking to the previous 160-character limit if you can.
Make sure that the keyword or a close variant of the keyword is included in the description tag, but avoid falling into the trap of “keyword stuffing,” as this practice will make you look spammy in the eyes of Google and other search engines (e.g. Bing and Yahoo).
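The two description limits discussed above can be folded into a simple validator (a Python sketch; the `check_description` helper and its return messages are illustrative, not an official tool):

```python
def check_description(desc: str, soft_limit: int = 160, hard_limit: int = 320) -> str:
    """Check a meta description against the old 160 and new ~320 character limits."""
    if len(desc) <= soft_limit:
        return "ok"
    if len(desc) <= hard_limit:
        return "long (may be truncated on some devices)"
    return "too long"

print(check_description("A concise, keyword-relevant summary of the page."))  # ok
```

Pair this with a manual check that the keyword (or a close variant) appears once, without stuffing.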
9. Keyword in H1 Tag
H1 tags are important relevancy indicators for search engine bots and yes, you can have more than one, as this 2009 video of Matt Cutts on H1 Tags clarifies. The important fact to remember is that the number of H1 tags is not as important as how those tags are used. Hence, use them accurately, rather than for styling purposes.
10. Keyword in H2, H3 Tags
Having your keyword appear in the H1 tag(s) is crucial but don’t forget your H2 and H3 subheadings.
11. Keyword Density
Having a keyword appear more often than any other likely acts as a relevancy signal. To paraphrase a 2011 Matt Cutts explanation of keyword density: the first few times a keyword is used, it weighs heavily as a relevancy indicator; however, each additional use adds less and less weight. The reason behind this might be Google’s attempt to counteract keyword stuffing. The best course of action is to use synonyms (yes, search engines understand synonyms) and to ensure that the text still reads well.
In other words, keyword density is important, but there is no “optimal” keyword density rate.
12. Keyword Word Order
If you were to type “on-site SEO strategy” into Google, would you rather click on a webpage with exactly this title or one that has these words inverted, such as “strategy for on-site SEO”? Again, thinking of relevancy, Google considers the exact match title to be more relevant to the search. This highlights just how important keyword research is.
13. Content Length
Content with more words can cover a wider breadth and is likely preferred to shorter, superficial articles.
In 2012, serpIQ conducted a study exploring how content length affects page ranking and found that content length correlated with SERP position. Posts with fewer than 2,000 words ranked #10 or below. To land in the #1 spot, a post needed 2,416 words on average. However, don’t let this number mislead you into thinking that quantity goes before quality. For some of my clients, I see high rankings even if their posts do not reach the 2,000-word mark. Content length is, after all, only one small factor in SEO.
14. Latent Semantic Indexing
Latent Semantic Indexing (LSI) is a mathematical method used to help search engines understand your page. Let’s say you have the word Apple in the title of your page; a search engine might not readily know whether this term stands for the fruit or the company. As the search engine crawler indexes that page, it will look for the most common words and phrases on that page to determine important keywords.
LSI keywords on your page help the bots extract meaning by establishing relationships between words. This is one of the reasons why using synonyms throughout your content is important. Latent Semantic Indexing was born to counteract keyword stuffing.
I highly recommend against the use of article-spinning software that claims to help with LSI. What it produces is usually just a lot of unreadable garbage, when what you’re aiming for is quality content.
15. LSI in Meta Tags and URLs
LSI keywords in page meta tags and URLs can help Google extract meaning and determine relevancy.
16. Page Loading Speed
Eager to provide the best user experience, Google has been using page loading speed as a ranking factor since 2010. The ideal load time is below 2 seconds, with a server response time under 200 ms. If you are eager to find out how you perform on that front, you can use Google’s free PageSpeed Insights.
Google can determine page load time by looking at file size and other HTML indicators. It may also use Chrome user data to identify a page’s loading time.
17. Duplicate Content
Many sources suggest that duplicate content negatively affects search engine rankings. While there certainly is some truth to this, anyone who has ever taken the time to read Google’s guidelines on duplicate content will notice that this refers mostly to content that “is deceptive in origin.”
This distinction becomes strikingly evident when you search for a popular topic on Google (e.g. a news report). Sometimes the entire first page is filled with the same article, published on various websites. That said, Google’s algorithm is fairly advanced at distinguishing original from scraped content, so the common obsession with content scrapers seems unjustified.
I would recommend holding yourself to a high standard without losing sleep over content scrapers that don’t follow the same values.
18. Recency of Content Updates
With the 2010 Google Caffeine update, Google started to place more importance on recency, especially for time-sensitive searches.
19. Significance of Content Updates
The significance of edits and changes is also considered as an SEO factor. Adding or removing entire paragraphs on your page is certainly considered more significant than adding or removing a few words.
20. Historical Updates
When determining relevancy, Google also looks at update history. Could a page that has not been updated in over 10 years still have the same relevance for the reader as it had a decade ago? Maybe not. More recently updated pages are more likely to meet the reader’s needs.
21. Readability Score
When optimizing your website for search engines, keep in mind that Google, Bing, Yahoo, and the like care about user experience. User-focused and user-friendly content is paramount to SEO success. People usually prefer to read simple language and search engines have recognized this. Here are some factors that determine readability:
- sentence length
- paragraph length
- use of transition words
- use of subheadings
- lack of passive voice
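A few of the factors above (sentence and paragraph length) are easy to measure yourself. Here is a rough sketch in Python; the `readability_stats` helper is my own illustration, not an established readability formula, and it deliberately ignores harder signals like passive voice:

```python
import re

def readability_stats(text: str) -> dict:
    """Rough readability indicators: sentence, paragraph, and word counts."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = text.split()
    return {
        "paragraphs": len(paragraphs),
        "sentences": len(sentences),
        "avg_words_per_sentence": len(words) / max(len(sentences), 1),
    }

stats = readability_stats("One two three. Four five.")
print(stats["avg_words_per_sentence"])  # 2.5
```

Shorter averages generally mean easier reading; established formulas such as Flesch-Kincaid refine the same idea.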
22. Grammar and Spelling
Grammar and spelling are often important quality indicators for readers and many SEO professionals assume they are part of Google’s search ranking criteria. However, Matt Cutts indicated in his 2011 video on Grammar and Spelling that they are not part of the ranking criteria for search engine optimization.
23. Helpful Supplementary Content
Helpful supplementary content, such as currency converters, can be an indicator of a page’s quality and thus enhance SEO ranking.
24. Multimedia Elements
Including images, videos and other multimedia in your pages can act as a content quality signal and thus positively affect search engine ranking. That said, these elements can negatively affect page load time. It is therefore important to strike the right balance.
25. Image Optimization
Images are often important for users to understand a page; however, search engines cannot readily “understand” an image. Optimizing images by changing their file name (before the file upload) to something legible (e.g. “[company name] logo” rather than “IMG_20171215_123456789”), and including image alt texts, descriptions, and captions, is important for SEO.
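The file-naming and alt-text advice above can be sketched in a couple of lines (Python; the `seo_image_name` and `img_tag` helpers are hypothetical illustrations, not a library API):

```python
import re

def seo_image_name(description: str, extension: str = "jpg") -> str:
    """Build a legible, hyphenated image file name from a short description."""
    base = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{base}.{extension}"

def img_tag(src: str, alt: str) -> str:
    """Emit an <img> element with the alt text search engines rely on."""
    return f'<img src="{src}" alt="{alt}">'

name = seo_image_name("Acme Inc logo")
print(img_tag(name, "Acme Inc logo"))  # <img src="acme-inc-logo.jpg" alt="Acme Inc logo">
```

The key point is to rename the file before upload; many platforms keep the original name forever.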
26. Alt Tag for Image Links
For hyperlinked images, the alt text is an image’s version of anchor text.
27. Bullet Points and Numbered Lists
Although not confirmed by Google, some SEO professionals recommend bullets and numbered lists to make your content more visually appealing and easier to consume.
28. References and Sources
Citing references and sources may be a sign of quality. However, it is not confirmed that these factors are used for ranking. It is a good practice, nonetheless, to reference your sources accurately, because if you are violating copyright, Google’s attention is the least of your worries.
29. Page Age
Older pages that are regularly updated may outperform a newer page. That said, the influence of page age seems to be minimal and it is more important what kind of updates you are performing, how frequent these updates are, and if you are regularly adding valuable content. As such, even younger and smaller pages can compete in SEO rankings if they focus on quality, relevancy, and user experience.
30. Page Category
Categories are not only important for structuring your content, they also impact the ranking of pages that belong to this category.
31. WordPress Tags
Tags are a WordPress-specific relevancy signal that allows the author of a page to group the site’s content together.
32. Presence of Sitemap
Because search engines cannot index pages that no link leads to, having a sitemap helps search engines index your website.
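A sitemap is just an XML file listing your URLs. As a minimal sketch (Python; the `build_sitemap` helper is my own, though the XML format follows the public sitemaps.org protocol):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Generate a minimal XML sitemap so crawlers can find every page."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

In practice you would save this as sitemap.xml and submit it via Google Search Console; optional fields such as lastmod can be added per URL.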
33. Breadcrumb Navigation
While breadcrumb navigation might not in itself be a ranking factor, this kind of set-up allows search engines to understand your website structure and index your pages.
34. Number of Pages
According to a 2013 video Matt Cutts posted on the relationship between indexed pages and SEO, the number of indexed pages does not directly influence search engine ranking. That said, the more pages you have that cover the variety of search term queries a user might have, the higher your chances of ranking well for those searches.
35. User-Friendly Design
As already discussed, user-friendly layout is a sign of quality and user experience, two factors that play heavily into search engine ranking.
36. Site Structure
A well thought-out and implemented site structure allows Google to thematically organize and index your content.
37. Site Updates
Just like we’ve already discussed on the page level, how often a site is updated — especially when these updates are major updates — is a factor for determining relevance.
38. Valuable Content
Not just on the page level, but on the entire site level, Google is looking for valuable content. Websites with poor content, such as thin affiliate sites, will occupy lower spots in Google’s ranking.
39. Quantity of Other Keywords
Pages that also rank high for various related (!) keywords could be perceived as having higher quality.
40. Server Uptime
Significant server downtime may affect your search engine ranking negatively, due to the resulting indexing problems and user experience issues.
41. Server Location
Especially in geo-specific searches, the location of your server can affect your ranking. In areas that have access to high-speed internet, the effect will be minimal. However, Google recommends assuming a slower internet connection speed when optimizing your site to truly create the best outcome possible for potential users.
42. SSL Certificate
Again, in their effort to enhance user experience, Google confirmed in 2014 that they use SSL certificates (https website protocols) as a ranking signal.
43. Contact Us Page
Google’s Search Quality Rating document states that “contact information and customer service information are extremely important for websites that handle money, such as stores, banks, credit card companies, etc. Users need a way to ask questions or get help when a problem occurs.” It is presumably also important as an SEO ranking factor as a result.
44. Terms of Service
Similarly to the Contact Us page, pages related to an organization’s terms of service or privacy policies could potentially help Google assess the trustworthiness of the organization in question.
45. Website’s Meta Information
Having duplicated meta information across your site can have a very negative impact on your entire website. You can search for these issues in Google Search Console under Search Appearance > HTML Improvements.
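If you maintain your pages in code, duplicated meta information is easy to detect before Google flags it. A sketch in Python (the `find_duplicate_meta` helper and the URL-to-metadata input shape are my own assumptions):

```python
from collections import Counter

def find_duplicate_meta(pages):
    """Return (title, description) pairs shared by more than one page.

    `pages` maps URL -> (title, description); a hypothetical input shape.
    """
    counts = Counter(meta for meta in pages.values())
    return [meta for meta, n in counts.items() if n > 1]

pages = {
    "/a": ("Widgets | Brand", "Buy widgets."),
    "/b": ("Widgets | Brand", "Buy widgets."),  # duplicate of /a
    "/c": ("Gadgets | Brand", "Buy gadgets."),
}
print(find_duplicate_meta(pages))  # [('Widgets | Brand', 'Buy widgets.')]
```

Any pair returned should get a unique title and description, or the pages should be canonicalized.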
46. HTML errors
Coding issues such as HTML errors are signs of a poor quality website and can affect your search engine ranking negatively.
47. Google Analytics Data
Many SEO professionals claim that Google Analytics data, such as bounce rate and time on site, directly influence a website’s search engine ranking. This is NOT the case. Matt Cutts busted these myths in his 2010 video on Google Analytics and SEO. In fact, not only does Google Search NOT use Google Analytics data; due to privacy concerns, the Google Search team never even syncs up data with the Google Analytics team.
48. User Reviews
Some SEO professionals claim that review sites like Yelp likely play an important role in Google’s algorithm. If you look at the actual link that a site like Yelp uses to point to a business’ website, the nofollow attribute stands out. Therefore, Yelp links to a business’ website cannot directly help with Google ranking.
In 2010, Google also investigated a report of a business that claimed that reviews influenced its rankings and optimized its algorithms even further to ensure that reviews (positive as well as negative in nature) do not impact rankings.
49. Page’s PageRank
Some SEO professionals still indicate that pages with higher PageRank tend to rank better than those with lower PageRank. However, in 2013 Google declared that “PageRank is something that we haven’t updated for over a year now, and we’re probably not going to be updating it again going forward, at least the Toolbar version.” It has, in the meantime, indeed been phased out.
50. Page Host’s Domain Authority
A page on an authoritative domain can indeed have a positive advantage in search engine rankings, all things being equal.
51. TrustRank
Interestingly, TrustRank is yet another myth that has already been debunked. Many SEO professionals claim that TrustRank plays a “massive” part in search engine rankings. Yet, on various occasions, dating back as early as 2007, Google clarified that it does not use TrustRank, a word mark (not a patent) it had filed in 2005.
In 2011, Matt Cutts also published another video explaining the factor of “trust” and how it is assessed for search engine ranking. In short, “trust” is an umbrella term used to summarize multiple ranking factors that we’ve already covered in this blog post.
And while we’re at it, if you looked at the actual trademark application, you would notice that it says “Abandonment Date February 29, 2008.” Hence, the trademark application was abandoned and there is no patent filed from Google.
52. Parked Domains
Since December 2011, Google has decreased search visibility of parked domains.
53. Human Editors
In 2000, Google filed a patent for a system that allows human editors to influence the SERPs, and as always when Google publishes a patent, the SEO world starts speculating. This time, the speculation revolved around Google “hard-coding” search engine results, and it has also already been debunked.
On various occasions, Google has confirmed that it does not hard-code search engine rankings and does not directly influence search results. Presumably, this system is used internally for A/B testing of search result quality by Google Quality Raters; however, this is simply my own speculation.
54. Keyword in Top Level Domain
In a 2011 video on keywords in domains, Matt Cutts confirmed that the appearance of the keyword in the top level domain is still a ranking factor, although the significance will decline over time. Hence, it is best to ensure that your domain is memorable, rather than focusing too much on finding a domain that includes the keyword, especially if the keyword is very competitive.
55. Keyword Location in Domain
Currently, a domain that begins with the target keyword still has a little edge over sites whose domains don’t. Again, instead of obsessing over squeezing the keyword into the domain, remember that it is more important to choose a memorable domain.
56. Keyword in Subdomain Name
Similarly to the keyword appearing in the top level domain, the keyword’s representation in the subdomain can also slightly impact your website’s ranking.
57. Exact Match Domains (EMD)
After the “Exact Match Domain” (short EMD) update in 2012 (for which the patent was filed as early as 2003), merely having an exact match keyword in the domain will no longer give you an edge if your website is of poor quality. However, if you have a quality website, an EMD can still make a difference.
58. Domain Expiry
A Google patent states that “valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain.” While this could be an indication that domain registration length is an important factor, not every patent actually gets implemented, and Matt Cutts clarified in his … video that it might not be as important as some SEO professionals claim it to be.
At the same time, keep in mind that Google cares deeply about relevancy! Knowing that a website with valuable information will be around for years to come must surely play at least a small part for search engines.
59. Domain Age
Domain age does play a small role in Google ranking. In his 2010 video on the importance of domain age on ranking, Matt Cutts confirms this by stating that: “The difference between a domain that’s six months old versus one year old is really not that big at all.” Surely, the difference between a six-month-old domain and one that is six years old would be larger, all other factors being equal. This, again, has to do with relevancy. If your website has been around for several years, it has proven to be competitive and the chances of it being around tomorrow are higher than the chances of an entirely new domain.
60. Domain History
A website that has kept the same ownership throughout the years will be seen as more relevant than a site with what’s called “volatile ownership”, that is, a site where ownership often changes. This data can be checked via WhoIs.
61. Public WhoIs
If a website owner opts for privacy protection on WhoIs, Google may understand this as a sign of “something to hide,” according to Matt Cutts at the 2006 PubCon:
“…When I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual. …Having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”
That said, Google never officially confirmed this. A user might want to keep this information private to avoid email spam and the likes. In my opinion, that kind of spam is almost unavoidable, so you might as well be honest with search engines.
That said, Google might not check WhoIs information unless the website has already been flagged.
62. Penalized Domain Owner
If Google identifies a particular person as a spammer, all other sites owned by the same person will likely be flagged. This makes the internet a lot safer and more enjoyable for all of us. Thank you, Google!
63. Country Code Top Level Domain (ccTLD)
The Country Code Top Level Domain (.ca, .de, .fr) helps the site rank for that particular country. At the same time, it might limit the website’s ability to rank well on a global level. To qualify for a particular country code extension, such as .ca in Canada, the domain owner often needs to confirm that s/he will adhere to that country’s code of conduct. In the case of a .ca domain, that would be the CIRA.
64. Number of Internal Links and Backlinks
A webpage that is referenced often (either via internal links, that is, links from pages within the same site, or via backlinks, that is, links from external websites to this page) may be important. That said: Beware of linking schemes and resist the temptation to “buy” links. These might get you blacklisted with Google and other search engines. Building relevant, quality links naturally is the right thing to do.
65. Quality of Internal Links and Backlinks
The number of links to a particular page is not the only factor that matters. Google also assesses the quality of those links, as indicated in this 2015 video from Matt Cutts on “Quality Links and Link Building”. Links from authoritative websites and webpages are considered a good indicator that the webpage they are pointing to is also of high quality.
66. Number of Outbound Links
Having too many do-follow outbound links may “leak” PageRank and can thus negatively affect rankings.
67. Number of Outbound Links on Referrer Page
PageRank is smart. A link on a page with hundreds of outbound links is not likely representative of quality content, and as a result, search engine ranking can be negatively impacted.
68. Outbound Link Quality
Google’s focus on user experience does not stop on your website. Where your outbound links (that is links from your website to another) lead does play a role for your search engine ranking, too.
In 2016, Google heavily focused on identifying linking schemes, paid links and the like and manually reached out to affected sites to inform them on identified violations, consequences, and possible actions.
69. Outbound Link Relevance
To provide the best user experience, Google also analyzes the topics of your outbound links. If your webpage’s content is about cooking and your links point to pages about cars, your webpage’s relevance to search queries on recipes could be impacted.
70. Canonicalization
If your own site contains multiple pages with identical content blocks (e.g. a printer version of your webpage), you can use canonicalization to help Google out.
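Canonicalization is implemented with a single link element in the head of each duplicate page. A minimal sketch (Python; the `canonical_link` helper is my own, though the rel=canonical tag format is standard):

```python
def canonical_link(url: str) -> str:
    """Emit the rel=canonical tag that points duplicates at the original page."""
    return f'<link rel="canonical" href="{url}">'

# Both the regular page and its printer version would carry this tag:
print(canonical_link("https://example.com/recipe"))
```

Every variant (printer version, tracking-parameter URLs, etc.) should point at the same canonical URL, so Google consolidates ranking signals onto one page.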
71. Broken Links
Having too many broken links on a webpage could be a sign of neglect and thus negatively affect your webpage’s quality rating and search engine ranking. You can easily check for broken links in Google Search Console.
72. Sitewide Links
In his 2012 video on site-wide backlinks, Matt Cutts has confirmed that these links are “compressed” to count as a single link.
73. Cloaking
Cloaking is a black-hat SEO practice that consists of presenting different content (or URLs) to users and search engines. When caught (notice that I said “when,” not “if”), the site will be penalized, de-indexed, and blacklisted by Google.
74. 301 Redirects
In a 2013 video on PageRank, Matt Cutts explains that the PageRank dissipation on a 301 redirect is similar to that of a regular page link.
75. Excessive 301 Redirects
In a 2011 video on 301 redirect limits, Matt Cutts clarifies that there is no limit on how many redirects can be used, but points out that 301 redirects should be reserved for pages that have truly migrated. If the redirect is temporary, a 302 redirect should be used. In other words, redirecting a site with over 10,000 pages to a new location will not hurt your search engine rankings.
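The 301-versus-302 distinction above boils down to a status code per redirect rule. A sketch of such a redirect table in Python (the `REDIRECTS` map, paths, and `redirect_for` helper are hypothetical; real setups live in server configs such as nginx or .htaccess):

```python
# Hypothetical redirect table: permanent moves get 301, temporary ones 302.
REDIRECTS = {
    "/old-page": ("/new-page", 301),      # page has permanently migrated
    "/sale": ("/summer-sale", 302),       # temporary seasonal redirect
}

def redirect_for(path):
    """Return (location, status_code) for a path, or None if no redirect applies."""
    return REDIRECTS.get(path)

print(redirect_for("/old-page"))  # ('/new-page', 301)
```

Using 302 for a permanent move wastes the signal that the old URL should be replaced in the index, so pick the status code deliberately.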
76. Diversity of Link Types
Google considers an unnaturally large percentage of your links coming from a single source, such as forums, blog comments, etc., as a possible sign of webspam. A well-balanced mix of link sources is considered a sign of natural link building.
77. “____ Links”
Terms like “sponsored links” or “link partners” could potentially impact that link’s value in the eyes of Google.
78. Contextual Links
Links embedded in a page’s content are considered more relevant than standalone links that appear out of context.
79. Keyword Appears in Referrer’s Title
As a relevancy indicator, Google considers links on pages that contain your page’s keyword in the title as “Experts linking to experts”.
80. Backlink Anchor Text
The anchor text of backlinks often provides a more accurate description of the referenced webpage than the page itself. As such, anchor texts are important signals of relevance.
81. Internal Link Anchor Text
Similarly to backlink anchor text, internal link anchor text is also a relevancy signal.
82. Link Title Attribute
The link title, that is, the text shown on hover, is also an indication of relevancy.
83. Link Location
Due to PageRank dissipation and other factors, links that appear in the beginning of the content carry slightly more weight.
84. Link Placement
Just the same as link location within content matters, link placement on a page matters as well. A link embedded in the page’s content is weighted more heavily than non-contextual links, such as those appearing in the footer or sidebar.
85. Quality of Referrer Content
A link from poorly written content on a referrer website is not regarded as highly as a link from well-written, quality content.
86. Number of Referrer Root Domains
The number of referring domains is an important ranking factor.
87. Number of Links from Separate C-Class IPs
Similarly to the number of different web domains referring to your site, the number of links from separate (class-c) IP addresses indicates your site’s relevance to a wide breadth of websites.
88. Total Number of Linking Pages
In his 2012 video on site-wide backlinks, Matt Cutts confirms that when multiple links from the same domain appear on a page, Google might try to group them together and only count them once.
89. Referrer Domain Age
Backlinks from older referrer domains may have a more positive impact on your ranking than those of new domains. This again, has to do with Google’s desire to provide quality, relevant content, and great user experience.
90. Linking Domain Relevancy
Again, playing into Google’s desire to provide relevant results, a link from a referrer in a similar industry or niche is considered more important than a link from a completely unrelated site.
91. Links from .edu or .gov Domains
In his 2010 video on Twitter and Facebook, Matt Cutts debunked yet another common misconception about ranking factors. This time it was related to the often-cited SEO myth that links from .edu and .gov domains are more important than those from other top-level domains.
92. Links from “Authority” Pages
Getting links from pages that are considered important resources (e.g. the New York Times) or experts on a certain topic has a positive effect on your ranking.
93. Wikipedia as Referrer
Although Wikipedia links are nofollow, many individuals trust these pages, and it is possible that Google does, too.
94. User-Generated Content Links
Google is able to distinguish links placed in user-generated content from links added by the actual website owner.
95. Reciprocal Links
One indication of spammy content that Google tackled with its Penguin update was black-hat linking schemes, such as reciprocal links and link wheels. These indicators have meanwhile made their way into Google’s algorithm. Google’s Link Schemes page gives more information on what to avoid.
96. Forum Profile Links
Over the years, forum profile spamming has caught Google’s attention and as a result, links from forum profiles may be significantly devalued.
97. Guest Posts
Many blog and website owners consider guest posting an integral part of white-hat SEO. In the eyes of Google, however, links coming from guest posts (especially those embedded in the author’s bio) are considered less valuable than contextual links.
98. Context Around Backlinks
Similarly to the context you need to understand a certain phrase you read, the words that appear around your backlinks also help Google identify what that page is about. For that to happen, one condition needs to be met: the page must be publicly accessible.
99. Country Top-Level Domain of Referring Domain
Links from country-specific top-level domain extensions (e.g. .ca, .de, .fr) to your website may help you rank better in that country. However, as always, relevancy matters, as Matt Cutts clarifies in his 2012 video on site-wide backlinks.
100. Links from Bad Neighbourhoods
Links to and from “bad neighbourhoods”, such as online pharmacies, porn sites, or payday loan sites, may hurt your search visibility.
101. Affiliate Links
Light use of affiliate links seems to have no direct effect on SEO ranking. With excessive use of affiliate links, on the other hand, Google might check whether your site falls into its category of a “thin affiliate site” and might consequently penalize you.
102. Backlink Age
According to a Google patent, older links have more ranking power than newly minted backlinks. Given that Google places a lot of weight on relevancy and fresh content, it is uncertain if this ranking factor is indeed in effect.
At the same time, age is a factor that cannot be changed and thus links that have been around for a while can be a good signal for Google to trust your site.
103. Schema.org Microformats
Pages that support microformats may rank higher than pages without them.
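Structured data is most commonly added today as JSON-LD using the schema.org vocabulary. A minimal sketch (Python; the `organization_jsonld` helper is my own illustration, though the @context/@type structure follows schema.org):

```python
import json

def organization_jsonld(name: str, url: str) -> str:
    """Build a minimal schema.org Organization block as embeddable JSON-LD."""
    data = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": name,
        "url": url,
    }
    return f'<script type="application/ld+json">{json.dumps(data)}</script>'

print(organization_jsonld("Acme Inc", "https://acme.example"))
```

Richer types (Article, Product, Recipe, etc.) follow the same pattern and can additionally qualify pages for rich results in the SERPs.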
104. DMOZ Listed
Many SEO professionals believed that Google favoured websites listed in the open-content directory DMOZ. However, as of March 2017, this factor is no longer relevant, as DMOZ is no longer available.
105. Natural Link Profile
A site with a “natural” link profile is likely to rank highly, again stressing Google’s requirement for relevancy and user experience.
106. Positive Link Velocity
A site with positive link velocity, that is, a site that quickly establishes a solid linking network, will usually be perceived as a quality site by search engines.
That said, if your link velocity is too high, it can make your site look spammy and actually hurt your website. As long as you “naturally” build links, however, you should be fine.
107. Negative Link Velocity
Hand in hand with the statement about positive link velocity; negative link velocity can have a significant negative impact on your site’s search visibility.
108. Nofollow Links
Google indicates that “in general, we don’t follow them. This means that Google does not transfer PageRank or anchor text across these links. Essentially, using nofollow causes us to drop the target links from our overall graph of the web. However, the target pages may still appear in our index if other sites link to them without using nofollow, or if the URLs are submitted to Google in a Sitemap.” The last sentence here is crucial, and I’m mentioning it to clarify this matter for those SEO professionals who have subscribed to the misconception that “in general, we don’t follow them” means that Google occasionally does follow nofollow links.
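In markup, the nofollow hint is simply a rel attribute on the anchor element (example.com is a placeholder):

```html
<!-- A normal link: passes PageRank and anchor text signals -->
<a href="https://example.com/">Example Site</a>

<!-- A nofollow link: Google generally drops it from its link graph -->
<a href="https://example.com/" rel="nofollow">Example Site</a>
```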
109. Organic Click-Through Rate for a Keyword
Pages with a higher click-through rate (CTR) are an indication of relevance and as a result may get a boost on Search Engine Result Pages (SERPs) for that particular keyword. That said, I am often asked if AdWords performance could help with SERP rankings as well, and the answer to that question is a definite “no.” Paid traffic is quite different from organic traffic and is treated differently on SERPs.
Also, CTR is obviously influenced by the position of the search result, simply because users are more likely to click on top results. There are likely some algorithmic ways of dealing with this “position bias.”
110. Bounce Rate
As we’ve already discussed above, some SEO professionals argue that Google Analytics data plays into search engine ranking, which Matt Cutts has already clarified is not the case. That’s not the bounce rate we’re talking about here.
There are other ways for Google to determine bounce rate from a website after a click on a link displayed on a search engine results page and hence evaluate the relevancy of that page.
111. Direct Traffic
SEO professionals claim that sites with lots of direct traffic rank higher and that Google draws this information from Google Chrome user data. In 2009, Google confirmed these suspicions but pointed out that this data is collected through the opt-in toolbar PageRank feature. Since toolbar PageRank has been retired, it is unclear if and how Google is now collecting this data. And in 2012, Matt Cutts confirmed that Google does NOT use data from Google Chrome when evaluating websites. It is therefore possible that this kind of data mining has ceased.
112. Repeat Traffic
Google may also take repeat traffic into account when evaluating website relevance and website quality.
113. Blocked Sites
In 2011, Google’s Panda update started incorporating data on sites users block into their website quality analysis and it has since made its way into the algorithm.
114. Number of Comments
User interactions, such as comments received on your blog, may be a signal of quality and thus help with search engine rankings.
115. Dwell Time
As mentioned above, Google does determine bounce rate and “dwell time” (time on page), however NOT via Google Analytics data. It has other ways to compute these factors and they are being taken into account when evaluating your website’s quality and relevance.
116. Google Toolbar Data
As previously discussed, Google has collected Chrome browser usage data to evaluate site quality and relevance. This was true for things like page load time, malware, and blocked sites. Some SEO professionals claim that Google also collects data on bookmarked sites (in Chrome) but this has never been officially confirmed.
117. Freshness Indicators
Google’s users love fresh content, and the company responds by slightly boosting search engine rankings for websites that meet that criterion. Webpages that are often updated with new, valuable content are a sign of freshness and quality.
118. Query Deserves Diversity
For ambiguous keywords, Google might show a diverse set of search results, including those that might not normally have made the cut, to satisfy user searches.
119. User Browsing History
Sites that a user frequently visits while signed into their Google account might get a more favourable ranking on SERPs for related searches.
120. User Search History
Google’s algorithm is able to read a user’s search intent. For example, if someone searched for “Flatscreen TV” and later for “reviews,” Google can understand that the user is in the market for a flat-screen TV and might therefore give review sites for flat-screen TVs a little boost in the SERPs.
121. Geo Targeting
As mentioned above, Google might boost sites with a local server IP and country-specific top-level domain name extension if this is relevant to the search query.
122. Safe Search
When a user has Safe Search switched on, results that include curse words or adult content won’t appear.
123. Google+ Circles
In an effort to create the best search experience, Google might show authors and sites that a user has added to their Google Plus Circles higher in search results for that particular user.
124. DMCA Complaints
In 2012, Google announced that it takes violations of the Digital Millennium Copyright Act (DMCA) into account. In 2017, Google rolled out three additional changes to its transparency report feature.
125. Domain Diversity
With the goal of providing choices to users and thus enhancing user experience, Google may aim at diversifying domains on SERP.
126. Transactional Searches
In 2012, Google rolled out Google Compare. Four years later, in March 2016, the company then shut down its comparison tools, such as Google Compare Auto Insurance. However, it looks like two transactional services, namely Google Flights and Google Shopping are here to stay.
127. Google News
Keywords that Google assumes are related to news searches will trigger Google News results to show up on the search results page.
128. Google Shopping
When Google picks up that a user is in the market for a specific product and the user types in a keyword that is related to this potential transactional search query, Google will display related items from Google Shopping on the search results page. These are, however, NOT organic search results and should not be understood as such. These results are sponsored content drawn from competitive Google AdWords ads and include descriptions and images from the advertisers’ Google Merchant Center accounts (which feed that data to Google AdWords for shopping campaigns).
129. Google Image Search
When Google notices that a user query would benefit from image results, it will display related results from Google Image Search. These results are deemed relevant based on their image alt tags, captions, descriptions, or even meaning derived from the image’s contextual content on that page.
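For example, a descriptive alt attribute gives Google textual context for an image (the filename and text below are placeholders):

```html
<img src="flatscreen-tv-55-inch.jpg"
     alt="55-inch flat-screen TV mounted on a living room wall">
```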
130. Fred (2017)
Tackling: Thin affiliate sites and ad-centered content
Fred aimed at identifying websites that violate Google’s webmaster guidelines, such as sites that focus heavily on affiliate or ad revenue while providing thin, low-quality content.
Some refer to a minor update to Fred in December 2017 as the Maccabees update, although according to Google it was too small to warrant its own name.
131. Possum (2016)
Tackling: Location-based search results
The Possum update focused heavily on location-based relevance. The closer a user is to a particular business, the higher that business will show in local search results.
132. RankBrain (2015)
Tackling: Poor content and poor UX
RankBrain is a machine-learning technology that’s part of Google’s Hummingbird algorithm, tackling poor content and poor user experience.
133. Mobilegeddon (2015)
Tackling: Poor mobile usability
Mobilegeddon aimed at enhancing great user experience for mobile. Since this update, pages not optimized for mobile are negatively affected in search visibility.
134. Pigeon (2014)
Tackling: Poor quality results for local searches
Pigeon aimed at improving the results of searches where the user’s location is important for search results.
135. Hummingbird (2013)
Tackling: Keyword stuffing
Hummingbird aimed at helping Google understand the intent behind search queries and matching them with the webpages that best answer the user’s query, even if a webpage does not contain the actual keywords the user entered. Factors of importance are latent semantic indexing (LSI) and co-occurring terms.
136. Penguin (2012)
Tackling: Irrelevant and spammy links
Google’s Penguin update tackled irrelevant and spammy links that seemed to manipulate ranking. It also fought linking schemes.
137. Panda (2011)
Tackling: Poor content (e.g. duplicate, plagiarized or thin content)
Since the Panda update, webpages have received a “quality score” based on their content, which is then used as ranking factor.
138. Caffeine (2010)
Tackling: Overhaul of Google’s web indexing system
With Caffeine, crawlers gained the capability to crawl and index pages in seconds, leading to superior information on the “freshness” of content across the web.
139. Vince (2009)
Tackling: Brand mentions
With this update, Google was giving “big brands” a little boost in the search results for short-tail keywords.
Web Spam Flags
140. Popups or Distracting Ads
As indicated in the official Google Rater Guidelines Document, popups and distracting ads are signs of a low-quality website.
141. Over-Optimization
Webpages that indicate “over-optimization” of on-page content, e.g. via keyword stuffing, header tag stuffing, excessive keyword decoration, and the like, are a sign of poor-quality websites.
142. Ads Above the Fold
The 2012 Page Layout Algorithm penalizes websites that excessively greet users with ads above the fold.
143. Number of Google AdSense slots on the page
An excessive amount of Google AdSense ad slots on a page might signal poor page quality and hence affect search engine ranking, especially if these ad slots are above the fold, as just discussed.
144. Affiliate Sites
Countering black-hat SEO techniques such as “cloaking”, Google has rolled out various updates, most notably the major Fred Update in 2017, to shut down thin affiliate sites.
145. Hiding Affiliate Links
Further to the previous point, Google also penalizes sites that are attempting to hide these affiliate links.
146. Auto-generated Content
As indicated in its Google Search Console Quality Guidelines, Google will penalize and de-index automatically generated (“auto-generated”) content.
147. Excess PageRank Sculpting
PageRank sculpting, the practice of excessively adding nofollow attributes to all outbound links or to most internal links, may be a sign of gaming the system. As a result, Google looks for suspicious activity that falls within that definition and penalizes violators.
148. Spam Sites on Website Server
If your server’s IP address is flagged for spam, it may hurt all of the sites on that server. In his 2013 video on the effect of this hosting issue on site rankings, Matt Cutts indicates that for the most part, a spammy domain that’s hosted on the same server as your site should not affect your ranking. However, there are exceptions. If the number of spammy sites on that host exceeds the “norm,” your site might potentially be affected. It pays off to check which other pages are hosted on the same server, which can easily be done through a Reverse IP Domain Check.
149. Meta Tag Spamming
Keyword stuffing can also happen in meta tags. This is one of the reasons why Google no longer uses keyword meta data as a ranking factor. If Google thinks that a website owner is stuffing keywords into the site’s meta tags (e.g. title, description, and image meta tags), the site will face penalties.
150. Unnatural Link Acquisition Speed
A sudden acquisition of a large number of links can be a very good indicator of phony link building schemes. A site that participates in such a scheme may end up in the so-called “Google Sandbox.”
151. High Percentage of Low Quality Links
A high number of low quality links can be an indication for black hat SEO practices, a technique that Google penalizes in its search engine rankings.
152. Linking Domain Relevancy
Sites with an unnaturally high number of links from unrelated sites are penalized by Google.
153. Unnatural Links Warning
In 2012, Google sent out thousands of “Google Webmaster Tools notice of detected unnatural links” messages alerting website owners about suspicious links that needed to be cleaned up. Since then, websites that did not follow through and did not remove those suspicious linking schemes are still being penalized by Google.
154. Links from the Same C-Class IP
As mentioned above, large amounts of links coming from sites on the same server IP may be a sign of a linking scheme that Google penalizes.
155. “Poison” Anchor Text
“Poisonous” anchor texts, such as certain pharmacy keywords, can be a sign of spam or a hacked site and as a result negatively affect the website’s ranking in search engine results.
156. Temporary Link Schemes
Likely through the use of archived data, Google is now able to identify when a website owner creates and then quickly removes spammy links.
157. Selling Links
Selling links is considered a serious offence and will hurt search visibility.
158. Manual Review
Google regularly performs manual reviews of websites for quality control and might hand out manual penalties in cases where a website might be violating Google’s policies and regulations.
159. Disavow Tool
If a website owner notices that a competitor is engaging in negative SEO, trying to hurt the site’s reputation through suspicious links, the owner can disavow those links via Google’s Disavow Tool in the Google Search Console, asking Google to ignore them.
160. Reconsideration Request
A website that feels it was wrongfully penalized by Google or has since removed the violating content can submit a reconsideration request to Google.
161. Branded Searches
When a user is searching for a brand on Google (e.g. “Apple MacBook Air”), Google takes this into consideration when returning search results.
162. Brand Name in Anchor Text
A branded anchor text is a strong signal for relevance in brand-related searches.
163. Big Brands
In 2009, Google rolled out a major update called Vince. Since then, Google may be giving big brands a small boost for certain short-tail searches.
164. Single Site Results for Brands
Because Google aims to provide relevant search results, domain or brand-oriented keywords can appear multiple times on a SERP, if the results are indeed relevant to the search, such as a branded search. Google does not, however, favour brand mentions over any other search result, as clarified by Matt Cutts in his 2009 video on branding weight in search results.
165. Brand Mentions on News Sites
Large brands that get a significant amount of news coverage (e.g. Tesla) receive a higher number of mentions on Google News.
166. Unlinked Brand Mentions
Companies and brands often get mentioned with positive and negative sentiments even without a hyperlink. Google might look at non-hyperlinked brand mentions as well, although this is unconfirmed. If a business receives a lot of positive brand mentions that are not pointing to its website, one way to catch those is via Google Alerts. One link-building strategy for the business in question could be to reach out to websites that are mentioning the brand and ask them for a hyperlink.
167. Social Signals
Google might be using relevancy information from the account sharing the content and the text surrounding the link.
168. Site-Wide Social Signals
Site-wide social signals may increase a site’s overall authority.
169. Legitimacy of Social Media Accounts
Clout signals, such as follower count and number of posts, might play a role in determining an account’s authority.
170. Social Shares of Referring Page
The number of social shares a page receives may influence the link’s value.
171. Google Verified Local Listing
Brick-and-mortar businesses can verify their business account with Google. Verified businesses might rank higher in location-based search results and on Google Maps.
172. Google+ User Accounts
Google might consider +1s coming from Google+ accounts when determining page authority in its search engine rankings.
173. Links from Facebook Pages
Google has confirmed that it is looking at links coming from social media, as long as the anchor text is accessible. In other words, links from Facebook Pages might help with search engine rankings, as long as they are public.
174. Authority of Facebook User Accounts
When a Facebook Page has a lot of followers, it might receive a lot of shares and if those shares happen to include a link to the homepage of that business, it might help boost the page’s authority in search engine rankings. As mentioned earlier, when those links are hidden behind a wall of security, they might not play a role in search engine rankings after all.
175. Twitter Presence
Having a Twitter presence could be an indication of relevance and quality; however, Google has not officially confirmed this.
176. Number of Tweets
Some SEO professionals argue that the number of tweets a page receives can influence its rank in Google Search results. However, Twitter links are usually “nofollow” links, so this seems to be a case of “correlation” rather than “causation.”
177. Authority of Twitter Profile
A Twitter profile with a high clout score could be a signal of brand authority. Whether or not this indeed plays a part in search engine ranking has yet to be confirmed.
178. Official LinkedIn Company Page
Businesses that have a LinkedIn page could potentially be rewarded with a slight boost in search engine rankings.
179. Employees Listed at LinkedIn
Having employees connected to a business’ official LinkedIn page could potentially be a brand signal. Whether this has any effect on search engine rankings remains to be seen.
180. Pinterest Presence
Since 2012, Pinterest links have adopted the “nofollow” attribute. As a result, for SEO reasons, Pinterest might no longer be as attractive as it used to be. However, it is possible that, similarly to other social media accounts, having a Pinterest profile with popular pins might be a brand signal that could potentially weigh into Google’s ranking.
181. YouTube Presence
Anyone who has ever looked up lyrics to a song has noticed that results from the world’s second-largest search engine, YouTube, are given preferential treatment in SERPs. This might or might not have something to do with the fact that YouTube is owned by Google. However, the key, as always, is relevancy.
182. Votes on Social Sharing Sites
Google might use votes from social sharing sites such as Reddit, StumbleUpon, and Digg as a social signal of relevancy.
183. Number of RSS Subscribers
Google owns the Feedburner RSS service, and it is possible that it considers RSS subscriber data as a signal. This has, however, not been confirmed by Google, and given that the company also owns Google Analytics but does not use that data for its search engine results, it remains to be seen if the Feedburner influence on search rankings is a myth.
184. Mobile Responsiveness
Especially since its Mobilegeddon update in 2015, it is no longer a secret that Google cares heavily about mobile experience. Having a responsive site is therefore important for ranking well, especially for searches performed on mobile.
185. Blocked CSS, JS, and Images
Some webmasters previously blocked CSS, JS, and images for mobile sites because they were not supported. Nowadays, they are supported in most cases, and Google’s crawler wants to access these items for indexing.
186. Use of Interstitials
On a small screen, a full-screen pop-up often results in a poor user experience. As already discussed, a poor user experience means poor search engine ranking for your (mobile) site.
187. Flash usage
Most mobile browsers do not render Flash. For special effects, use HTML5 instead.
188. Viewport Configuration
The viewport meta tag helps browsers with scaling the page to suit a specific device. Configuring it adequately will lead to an enhanced user experience.
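A commonly recommended configuration, placed in the page’s head, is:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

This tells the browser to match the page width to the device’s screen width and to render at a 1:1 initial zoom.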
189. Responsive Design Instead of Fixed-Width Viewport
A fixed-width viewport is not a good practice for mobile-optimized sites. I would recommend responsive design instead.
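As a minimal sketch, responsive design typically combines fluid widths with media queries (the class name and breakpoint below are illustrative):

```css
/* Single fluid column by default (narrow screens) */
.content {
  width: 100%;
}

/* Two columns once the viewport is at least 768px wide */
@media (min-width: 768px) {
  .content {
    width: 50%;
    float: left;
  }
}
```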
190. Content sized to viewport
This factor concerns pages whose content does not fit the window, forcing the user to scroll horizontally. This can be fixed with relative rather than fixed widths.
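For instance, a fixed-width image overflows a narrow screen, while a relative width lets it shrink with its container (the selector name is illustrative):

```css
/* Problematic: a 980px image forces horizontal scrolling on phones */
.hero-image { width: 980px; }

/* Better: the image never exceeds its container's width */
.hero-image { max-width: 100%; height: auto; }
```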
191. Font size
Scaling issues, such as font sizes that are too small to read without zooming in, again affect user experience negatively. As we’ve seen in the past, a poor user experience is bad for SEO.
192. Spacing of touch elements
Touch screen navigation can lead to accidental clicks when neighbouring buttons are too big, too close together, or in the user’s scroll path. Accidental clicks can often lead a user to abandon the site, resulting in poor website performance and poor user experience, both factors that play a role in search engine ranking.
193. Optimize titles and meta descriptions
Remember that you’re working with less screen space when a user searches using a mobile device. To show off your best work in SERPS, be as concise as possible (without sacrificing the quality of the information) when creating titles, URLs, and meta descriptions.
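For illustration, a concise title and meta description might look like this (the text is placeholder content):

```html
<title>Flat-Screen TV Reviews | Example Store</title>
<meta name="description"
      content="Independent reviews of this year's best flat-screen TVs, compared by price, size, and picture quality.">
```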
194. Fast Re-Directs to Mobile Page
When a re-direct from a desktop website to a mobile website is necessary (e.g. in cases where the URL starts with “m.” or “mobile”), this re-direct should occur as fast as possible. Otherwise it hurts user experience and might affect your search rankings.
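When separate mobile URLs (e.g. starting with “m.”) are used, Google documents a bidirectional annotation so that crawlers understand the relationship between the two versions; the URLs below are placeholders:

```html
<!-- On the desktop page (https://www.example.com/page): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page): -->
<link rel="canonical" href="https://www.example.com/page">
```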
195. Fast Page Load Time of Mobile Site
In cases where no redirect is needed because the site itself is “responsive” and automatically adjusts to screen size, it can be tempting to build the site with desktop in mind. Often, these sites are quite “heavy” and as a result load slowly on mobile devices. For a mobile visitor on 3G, a single redirect may add up to half a second of load time, which can cost between 3 and 10% of your conversions.
196. Data Heaviness
Keeping the mobile website as “light” as possible in terms of size is important for fast load time and great user experience. A user that is loading a data heavy site might also use up a lot of their data and might think twice before visiting that site again on their mobile phone.
197. Site Search Visibility
Making site search visible is important for mobile sites, where navigation can sometimes be an obstacle to great user experience.
198. Simplify Form Entries
Specifically for mobile sites, where navigation and interaction are more difficult than on a desktop computer, form entries need to be simplified. This can be done by streamlining entries and opting for easy input methods.
199. Expandable Product Images
Making product images expandable is a vote for user experience.
200. Single Browser Window
Especially for mobile-optimized sites, it is essential that links do not open in separate windows but rather that the user is kept in a single browser window for as many interactions as possible.