Search Engine Ranking:
Google SEO factors 2018
Although SEO is a synergy of different practices, not all of them are equally important for achieving a good ranking. Since optimization takes time and continuous testing, it is important to prioritize the factors that offer the greatest benefits.
In this list we provide an analytical overview of the main optimization factors for the Google search engine.
Our intent is to share our knowledge and the results of our tests, in order to create a dynamic, continuously updated reference manual.
SEO factors: domain level
The age of a domain, i.e., the time elapsed since it was first registered, is not a very important factor in SEO. It is, however, often confused with other factors in which the time variable is crucial for gaining relevance.
- Suppose you have a site with valuable content: it takes some time for that content to be found and linked, earning natural backlinks. That is not related to domain age, however, but to the time elapsed since those pages were first indexed by the search engine.
Keywords in the domain name
Using the main keywords associated with your business in the domain name is certainly a relevance signal for search engines. At the same time, keyword abuse in the domain is a negative factor (the infamous keyword-rich domains), as is the inclusion of unconventional hyphens and glyphs (particularly penalized by Bing).
- • Do not use hyphens, and in any case never more than one
- • Do not use keyword-rich domains
- • Try to have a maximum of two consecutive relevant words
- • It is advantageous for the keyword to be at the beginning of the domain
Years of advance registration
Domains used for spamdexing and black-hat SEO are rarely registered many years in advance. That is why search engines tend to use this data as a factor in establishing the legitimacy of a domain.
Keyword in subdomain
Subdomains, unlike subfolders, are attractive options in certain circumstances (e.g., segmenting a local target), since search engines treat them as independent sites. They can be an effective strategy for generating more targeted traffic. If used improperly, however, subdomains can do more harm than good.
- • Avoid duplicated content
Time consistency is crucial in SEO. A domain that does not maintain a consistent editorial line and stable WhoIs data over time is likely to be heavily penalized in the ranking.
Exact match keyword domain
In the past, owning a domain whose name exactly matched a keyword or group of keywords was enough to rank easily in the SERPs, despite low-quality content. Since the introduction of the EMD penalty, however, Google's algorithm gives more importance to the quality of the content than to factors such as the domain name.
WhoIs: public vs. private
Despite the risks related to data security (WhoIs data are public), it seems that using privacy-protection services is a negative factor, although a negligible one.
While the IP of the machine has no SEO relevance, the WhoIs data do. Google in particular has recently confirmed that it uses WhoIs data to counter malicious behavior, penalizing all domains registered by the same person or company.
A geographic top-level domain (.us, .de, .it) helps search engines determine the geographic target, which is critical for ranking in a specific country. Understandably, this limits the chances of the site ranking optimally worldwide.
The site of our agency is indexed globally (in English) at Simpliza.com and with a geographic TLD in Italy (in Italian) at simpliza.it. The two sites are totally independent in terms of code and resources, but run on the same machine hosted in the United States (this parameter is completely indifferent to search engines).
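A multilingual setup like this is usually annotated with hreflang alternate links, so the engines serve the right language version to each user. As a minimal sketch (the exact paths are placeholders), each version declares itself and its counterpart in the page `<head>`:

```html
<!-- In the <head> of both language versions -->
<link rel="alternate" hreflang="en" href="https://simpliza.com/" />
<link rel="alternate" hreflang="it" href="https://simpliza.it/" />
<!-- Fallback for users matching neither language -->
<link rel="alternate" hreflang="x-default" href="https://simpliza.com/" />
```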
SEO factors at page level
Keywords in the title tag
The title tag is one of the fundamental factors in search engine optimization: it is used by the browser, by search engines to populate the SERP, and by third parties through scraping.
- • Never miss the H1 tag in the page
- • Be careful with the limits set by the search engines
- • A good title is worth a lot, almost as much as the content of the page itself
Keyword at the beginning of the title tag
The ideal title tag format should be:
Main keyword
+ Secondary keyword
+ Brand Name
This is the best practice, of course. The variables are numerous, and experience helps in finding the most effective formula case by case.
Keyword in the meta description tag
Another fundamental signal is the use of keywords or groups of keywords in the description tag. It is not appropriate, however, to faithfully replicate the title tag. It is important to strike the right balance between functional SEO copy and conceptual, user-oriented copy.
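As a sketch of these guidelines, a minimal page `<head>` might look like this (the keywords and brand name are placeholders):

```html
<head>
  <!-- Main keyword first, then secondary keyword, then brand -->
  <title>Main Keyword - Secondary Keyword | Brand Name</title>
  <!-- Reuses the keywords naturally without copying the title verbatim -->
  <meta name="description"
        content="A user-oriented summary that works the main keyword in naturally, kept under roughly 160 characters.">
</head>
```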
Keyword usage frequency (Keyword Density)
Keyword density has a huge impact on search engine rankings. This is relevant not only for the major keywords, but also for the less important ones: the search engine uses these words to determine the value of the combination of the individual words and to determine the topic of the page.
- • A high density is penalizing
- • SEO plugins for CMSs are often misleading (the same goes for content length)
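There is no official density formula; as a rough sketch, keyword density can be computed as the keyword's share of the total word count (the `keyword_density` helper below is hypothetical, not from any SEO plugin):

```python
import re

def keyword_density(text, keyword):
    """Return the frequency of `keyword` as a fraction of all words in `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "SEO tips: good SEO content beats keyword stuffing"
print(round(keyword_density(sample, "SEO"), 3))  # 2 of 8 words -> 0.25
```

A page where a single keyword pushes well past a few percent of the text is the kind of density the bullet above warns against.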
It is well known that the length of an article or page plays a decisive role in the SERP. But what is the ideal length? There is no precise answer, but by comparing a large number of in-depth articles one can derive the average length of the articles in the top positions of Google.
- • 1st position: average of 2460 words
- • 2nd position: average of 2480 words
- • 3rd position: average of 2430 words
- • 4th position: average of 2400 words
- • 5th position: average of 2310 words
- • 10th position: average of 2050 words
LSI keywords in the content
If the editorial line provides quality content, this factor is somewhat irrelevant, but it deserves a mention in this list. Using LSI (Latent Semantic Indexing) keywords simplifies the interpretation of the topic covered in the text, allowing search engines to position the page more easily.
LSI keywords in the title
The use of Latent Semantic Indexing keywords in the title and description meta tags is less obvious. It is essential, however, when the main search keywords have no semantic uniqueness.
Google, Bing, Yandex and Baidu in particular consider the loading speed of a page crucial in determining the rankings.
- • Monitor the loading speed
- • Use a CDN (Content Delivery Network) if necessary
“25-30% of all content on the web is duplicate content,” explains Matt Cutts (head of Google’s Webspam team). Over the years, the scraping algorithms of search engines have refined their ability to detect duplicate content, which is an enemy to fight. Duplicate content is not necessarily fraudulent, however. Often it arises from a bad configuration of multilingual sites, pagination (rel="next" and rel="prev"), dynamic variables, robots files and migrations.
- • Use the rel=canonical tag for the dynamic queries (Google also supports cross-domain)
- • Always communicate URL changes to search engines (e.g., through 301 redirects in .htaccess)
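As a sketch, the canonical version of a page generated by dynamic queries can be declared in its `<head>` with `<link rel="canonical" href="https://example.com/page/">`, while a URL change can be communicated with a permanent redirect. The fragment below assumes an Apache host with mod_rewrite enabled; the domain and paths are placeholders:

```apache
# .htaccess: permanently (301) redirect the old URL to the new one
RewriteEngine On
RewriteRule ^old-page\.html$ https://example.com/new-page/ [R=301,L]
```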
Images are a great resource for communicating a set of relevant signals to the search engine, through the file name, alt text, title, description and caption.
- • Never forget the alt attribute
- • Image search is an excellent resource, and entire strategies can be designed around it
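A minimal example of a fully described image (the file name and texts are placeholders):

```html
<figure>
  <!-- Descriptive file name, alt text and title all carry signals -->
  <img src="/images/red-running-shoes.jpg"
       alt="Pair of red running shoes on a wooden floor"
       title="Red running shoes">
  <figcaption>Red running shoes, 2018 collection.</figcaption>
</figure>
```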
Date of last update
Since the “Caffeine” search algorithm update, and in particular the recent “Hummingbird”, Google favors updated content, giving prominence to topics identified as events.
It is unclear how the algorithm is changing with regard to Google News, but the two types of content seem to be converging. Google News content is gradually losing the authority privileges we are accustomed to. We will see how it evolves.
SEO factors at site level
Uniqueness and quality of the contents
Today search engines tend to fight an excellent marketing strategy that is often poorly executed: review posts, guest blogging and affiliate programs. This is because marketers tend to give little attention to the uniqueness of the content. Marketing activities of this type are risky, but done well they give very satisfactory results.
Content update rate
You might assume that regularly updating a website brings a consequent improvement in the ranking. Actually, updating content stimulates the crawler to re-analyze the site and the new links, but what really matters is the qualitative factor, not the quantitative one.
Amount of indexed pages
Is the number of pages (and of multimedia content: images, PDFs, videos) an authority signal? It is good to keep in mind that ranking is not directly tied to the number of pages and content.
A larger number of pages does, however, increase the link juice, the opportunities to be indexed for different keywords, and the chances of gaining backlinks: factors which in turn may allow a better ranking in search engines.
Presence of sitemap
Making good use of the sitemap is vital these days, especially if your site is new. So much so that almost all search engines (Google, Bing, Yandex, Baidu, etc.) have integrated a “sitemap submission tool” into their platforms to communicate and validate its address. The advantages are numerous, while the risk of incurring penalties is low or zero.
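A minimal sitemap.xml sketch (the URL and date are placeholders); besides the submission tools, it can also be referenced from robots.txt with a `Sitemap:` line:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-01-02</lastmod>
  </url>
</urlset>
```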
If the site is inaccessible for extended periods of time or with excessive frequency, we risk a lot and must avoid it at all costs. Whatever the reason (e.g., server downtime), the risks we face range from the loss of some ranking positions to the total removal of the entire site from the index. You can, however, avoid the worst by taking the necessary precautions.
- Suppose the site is down for maintenance: we have to tell the search engine that the service is temporarily unavailable. In this case we use the HTTP status code 503; in its absence the crawler would encounter an endless list of 404 errors, which is bad for SEO. It is essential to be very careful and to anticipate every situation that may harm the ranking, whether the events are scheduled or not.
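As a sketch, assuming an Apache server with mod_rewrite and mod_headers enabled (the /maintenance.html path and the one-hour Retry-After value are assumptions to adapt):

```apache
# During maintenance: answer every request with 503 Service Unavailable
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/maintenance\.html$
RewriteRule ^ - [R=503,L]
# Page shown to visitors while the 503 is returned
ErrorDocument 503 /maintenance.html
# Hint to crawlers when to retry, in seconds (set globally in this sketch)
Header always set Retry-After "3600"
```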
Unlike what is often written on the subject, the geographical location of the server does NOT affect SEO, at least not directly. Rather, it is the ping speed that is an important factor, in particular for the download of multimedia content, for which a CDN service is often used.
The exception is the Chinese search engine Baidu, which gives much value to both the geolocation of the IP and the domain extension, preferring the .cn TLD and hosting in China (including Hong Kong). [Read more: Complete Baidu SEO factors list]
Google announced a few years ago that the SSL certificate has finally been introduced as an SEO factor [3 – HTTPS as ranking signal], in order to encourage webmasters to adopt the new web security standard. At the moment this factor is not very decisive in SEO (it matters more from other points of view, e.g., email spam).
Duplicated meta data
A duplicated meta description is a clear signal of activity identified as spam. Provocatively, Matt Cutts explains that it is preferable not to include any meta description at all rather than enter a duplicate one.
The breadcrumb is an indicator of the page’s position in the site hierarchy. It can be communicated to Google through breadcrumb structured data, which allows it to be shown in the SERP [4 – Better presentation of URLs in search results]. It is not a relevant ranking factor, but it definitely makes the search result more user-friendly.
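A minimal sketch of breadcrumb markup in JSON-LD (the domain and names are placeholders), placed anywhere in the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1,
      "name": "Blog", "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 2,
      "name": "SEO", "item": "https://example.com/blog/seo/" }
  ]
}
</script>
```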
Since April 2015, mobile optimization has become a relevant factor. There are essentially two ways to avoid penalties: make the site responsive or create a standalone mobile version.
Intrusive Interstitials Mobile Penalty
Starting January 10th, Google penalizes mobile pages with interstitials that impede user access to content: advertising pop-ups in lightboxes and standalone interstitials requiring a manual action to reach the content.
Techniques that, used responsibly, are not affected by the new signal include cookie-law pop-ups, interstitials for age verification, and app-download banners (provided they do not occupy too much screen area).
SEO factors: backlinks
Although backlinks tend to be given less and less importance, they are still the major ranking factor. It is also true that the diversity of links is more important than their quantity, and that a root backlink is more valuable than an internal-page one (particularly for Bing).
Number of backlinks from different IP
As we have seen, the IP of the machine is detected by all search engines. It is also used to assign a value to incoming backlinks, which will be low if the same server IP hosts several sites from which backlinks are conveyed.
Telling search engines not to “follow” a link may be helpful, but not for the reason for which it was created (introduced by Google in 2005, it is not a standard).
In practice, both Google and Yahoo follow the links but, like Bing, Ask, Baidu and Yandex, do not use them for ranking. Yahoo even indexes the linked page.
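The hint is set per link via the rel attribute (the URL is a placeholder):

```html
<!-- Ask engines not to pass ranking signals through this link -->
<a href="https://example.com/untrusted-page" rel="nofollow">example link</a>
```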
Contextual links are an excellent resource, both because the surrounding context ensures trust and credibility for the reader, and because the search engine analyzes keyword density and other variables to determine their value. To counter guest posting, however, search engines also give importance to the context of the site from which the backlink comes.
Backlink anchor text
It does not count as much as it used to, but the anchor text is still used to determine the conceptual relevance of the context the backlink comes from.
SEO factors at the user interactions level
Organic CTR for individual keyword
A fundamental metric in the SEM field, the CTR (Click-Through Rate) of organic results is often underestimated. We tend to spend more time on on-page SEO, social networks (SMM) and backlinks than on maximizing the clicks that can be generated. It seems that Google in particular does not use it directly as a ranking factor; rather, it uses impression data and user-side metrics such as the bounce rate.
Whether or not direct traffic affects search engine rankings has always been a much-debated topic. The impression is that direct traffic is not an SEO factor, and the fact that these data are often collected anyway is sobering.
Assuming you have installed Google Analytics, AdSense, tags, site authentication, Baidu Analytics, metrics or similar snippets that can communicate the site’s statistics, the search engine cannot intercept a visit if it comes from:
- • Bookmarks
- • Direct traffic (address on the navigation bar)
- • Traffic from social referrals
- • Traffic from other search engines
- • Referral email (not related to the search engine like Gmail)
These are all excellent sources of traffic, so it is reasonable to think that search engines do not use this factor in organic search rankings.
The “dwell time” is a combination of three components:
- • Session duration
- • Bounce rate
- • CTR (Click Through Rate)
It is not clear whether the exit rate, the page depth and the return rate also affect dwell time; we do know, however, that dwell time has an important influence on SEO.
SEO factors at social network level
Number of tweets
The number of links conveyed through tweets on Twitter is difficult to assess accurately as an “SEO signal booster”, because many variables interfere.
From tests performed on a sample it appears, however, that it could be considered one of the Google SEO factors, offering particular advantages at low numbers (1-50) and at very high numbers (over 500).
Number of likes on Facebook
There is a correlation, but no causal relationship, between the number of likes on a Facebook page and its ranking in search engines. It remains true that they are an excellent resource for generating and channeling traffic.
Company page on Linkedin
Today, if you manage a company, creating a page on LinkedIn is a must. It offers only nofollow links, but it easily achieves high rankings and is a good opportunity for co-citation.
Last update: 02/01/2018