
Things That Are Not Ranking Factors for Google

In this post, I want to address some common misconceptions about what is and is not a ranking factor, in the wake of the recent Twitter and SEO forum disputes over ranking variables.

Numerous items are associated with or correlated with ranking factors, but are not (or most likely are not) ranking factors themselves.

Why do we think that Google’s algorithm could take some of these non-factors into account?

You’ll find some of the most common issues raised by other SEO experts or clients on this page. I’ve tried to explain why each isn’t strictly a ranking factor and, where appropriate, have added comments from Google employees.

Things That Are Not Ranking Factors

Are things like social signals, accessibility, XML sitemaps, content length, and so on really ranking factors for Google Search? Here’s what you should know.

1. Site’s Age

Although Google has said that this is not a ranking component, I continue to see it listed in every list of ranking factors available.

Despite the fact that these factors are associated, correlation does not imply causality.

Long-running domains have had more time to accumulate all the signals that are used to determine ranking.

Your website has more information and links if it is older since it has had more time to gain users, good word of mouth, etc.

Age itself plays no role. The other signals typically accumulate with age, but they can appear at any age.

2. Period of Domain Registration

Domain registration length is similar. It’s a product you purchase. Making it a ranking factor would be pointless if you could simply pay for it.

Users don’t care how long your domain has been registered. It has no bearing on how relevant your website is to their search.

Is there a correlation? Yes, since spammers rarely pay for multiple years of registration up front.

Who else doesn’t pay for several years, you ask? Small firms or organizations do not want to incur the cost all at once.

It’s not really a problem to go annually since registrars provide auto-renew capabilities. It’s beneficial for tax purposes as well if you own hundreds or thousands of domains.

There are more accurate techniques to establish authority.

Although Google has a patent on using registration duration, that doesn’t necessarily mean they use it for ranking. Patents don’t work that way; almost anything can be patented.

Consider the Time Machine patent as an illustration. A patent on a method doesn’t imply that using the method actually improved anything.

3. Pogo-Sticking

Let’s first define the words. A user who visits one page and leaves without taking any further action is said to have a high bounce rate.

Pogo-sticking is the practice of visiting a page and then immediately returning to the search results (often to click another result). SEO professionals frequently cite it as a ranking factor, despite Google saying otherwise in a video.

It isn’t one. Apart from personalization, pogo-sticking doesn’t appear to be a factor in the main algorithm. It may be used for internal testing, comparing ranking adjustments against each other, quality control, and similar purposes.

There are plenty of situations in which pogo-sticking is perfectly normal. Every morning I pogo-stick through several results while reading “Detroit Red Wings” news on Google.

The same holds true for every click-based measure. They can be easily manipulated, are very noisy, and frequently don’t mean what we believe they do.

This is not to say that Google doesn’t compare two different search results pages using techniques like pogo-sticking. But it’s unlikely that they apply it at the site or URL level.

4. Word Count (Total Amount of Content on a Page)

This one is simply absurd. Certainly, having more valuable content is preferable.

Better content is more comprehensive. Better content is more relevant.

But just more content? Nope. Think like a user.

Would I prefer the page that simply states, “Detroit’s area code is 313,” or the one that uses lovely language to build up to the answer over the course of three thousand words?

In case you were curious, the frequency of content changes doesn’t matter either (in non-news search).

If I’m looking for a recipe for chicken soup, I don’t need to know everything about Grandma’s life; just tell me what I need and how to prepare it.

5. Direct Visits, Length of Stay, Bounce Rate

None of these is a ranking factor.

Only 54% of websites utilize Google Analytics, according to W3techs. Instead, Adobe Analytics is used by most Fortune 500 companies and large brands. Depending on the source you use, Chrome only has a 45–60% market share.

In other words, Google lacks a credible method for obtaining these statistics for more than 50% of the web.

Large businesses dominate the rankings, yet Google lacks access to their analytics data. Even if it had access, the signal is simply too noisy.

A high bounce rate is perfectly normal for many websites. Consider a weather site: most visitors only check the weather in one place, so a bounce is expected.

6. AMP

Not a ranking factor. While page speed is a ranking criterion, AMP is not the same thing as page speed.

Page speed has only a small ranking impact for most searches. Google will never rank a faster page above a page with more relevant content.

The phrase “I know I looked for Pepsi, but this Coke page is so much faster” won’t be heard from a user.

Does AMP speed up pages? It does, really. However, speed, not AMP, remains the ranking criterion.

Note: the Top Stories carousel does require AMP and appears above the #1 organic result, but it’s a search feature, not part of the ranking system, so it doesn’t count.

7. LSI Keywords

This is one of the many misinformation trends that periodically surface in SEO. It simply indicates that the person making the claim has no understanding of LSI at all.

The L stands for latent, which means hidden rather than explicitly present, so the term contradicts the way most SEO experts go on to use it.

This article on the subject explains it far better than I can.
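To see why “latent” matters, here is a toy sketch of what latent semantic analysis (the technique behind the LSI acronym) actually does. The corpus, terms, and dimensions below are entirely made up for illustration; the point is that LSA factors a term-document matrix with SVD so that related documents land near each other in hidden dimensions, and no “LSI keywords” appear anywhere in the process.

```python
import numpy as np

# Toy term-document matrix: rows are terms, columns are documents.
# Docs 0 and 2 are about dogs/puppies; docs 1 and 3 are about cats.
A = np.array([
    [2, 0, 1, 0],   # "dog"
    [1, 0, 2, 0],   # "puppy"
    [0, 2, 0, 1],   # "cat"
], dtype=float)

# LSA: factor the matrix and keep only the top-k singular dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # each row: one document in latent space

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# The dog and puppy documents land close together in the latent space,
# even though the latent dimensions are hidden and are not keywords at all.
print(cosine(doc_vecs[0], doc_vecs[2]))  # high similarity
print(cosine(doc_vecs[0], doc_vecs[1]))  # near zero
```

Notice that the latent dimensions are abstract directions recovered from the whole corpus, which is exactly why “add LSI keywords to your page” is not a meaningful instruction.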

8. TF-IDF Keywords

Once more, this is merely an SEO expert exposing their own ignorance of computer science to the rest of the community. Although TF-IDF is a concept from information retrieval, modern ranking rarely uses it.

In addition, there are currently many better alternatives to employing TF-IDF. It doesn’t function nearly as effectively as contemporary techniques, and it has nothing to do with ranking at all.

As a webmaster, you cannot run a TF-IDF analysis at the page level. The IDF part depends on the corpus of documents in the index.

You would need not only all the other relevant documents, but also the non-relevant documents to compare them against.

It is unrealistic to expect to learn anything by scraping the search results (the relevant documents only) and then computing TF-IDF. Half of the information needed for the computation is missing.
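To make that dependency concrete, here is a minimal TF-IDF sketch. The tiny corpus and the scoring function are my own illustration, not anything Google uses: the IDF term divides by how many documents contain a word, so no single page’s score can be computed without the rest of the corpus.

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF weight for each term in each document of a small corpus.
    Note that IDF requires document counts across the *whole* corpus."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()                      # in how many docs each term appears
    for tokens in tokenized:
        df.update(set(tokens))
    weights = []
    for tokens in tokenized:
        tf = Counter(tokens)
        weights.append({
            term: (count / len(tokens)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return weights

corpus = [
    "chicken soup recipe",
    "chicken soup history",
    "tie a tie",
]
w = tf_idf(corpus)
# "recipe" (rare in this corpus) outweighs "chicken" (common) on page 0 --
# a distinction you could never compute from page 0 alone.
```

Change the other documents in `corpus` and page 0’s scores change too, even though page 0 itself didn’t change. That is the half of the computation a scraped results page can’t give you.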

Here is a very basic introduction. Pick up an information retrieval textbook and read about these ideas if you want to understand more.

I recommend the book “Information Retrieval: Implementing and Evaluating Search Engines” by Stefan Büttcher, who works at Google.

9. Quality Raters and E-A-T

Quality raters have no direct impact on your website. They are not rating your site in any way the ranking system takes into account.

They produce (for want of a better phrase) training sets of data and aid in comparing algorithm improvements.

In essence, Google sends some algorithm updates to quality raters first to evaluate whether the updates achieve their goals. Raters may, for example, compare two search results pages and judge which is better for a given query.

They’ll think about making the modification live if it passes.

I know because I was a quality rater myself. Nothing I did in that job involved influencing how any single website ranked.

Additionally, just because something is listed in the quality rater standards doesn’t always indicate that it will affect rankings. The quality rater guidelines offer a condensed explanation of what all the real variables are attempting to measure in simple terms.

E-A-T is a prime example. There is no such thing as an E-A-T score, according to Google.

For humans, E-A-T simply serves as a conceptual representation of what the algorithm is attempting to approximate.

(If you want my view, I still think that PageRank is the best indicator of E-A-T, but that’s another subject.)

10. XML Sitemaps

Every SEO audit I come across flags “no XML sitemap,” which is a pet peeve of mine. I literally just wrote about it.

Rankings have nothing to do with XML sitemaps. At all. They are a way for Google to find your pages, but if it has already indexed all of them, adding an XML sitemap will be useless.

Not all websites need one. Although it won’t hurt, it won’t help if your site’s code and taxonomy are solid.

They serve as a sort of band-aid for websites with crawl problems.

In addition, John Mueller claims that HTML sitemaps are not ranking criteria if you really want to go down this rabbit hole.

Do you still need to create an XML sitemap?

Probably. There are several advantages besides improved rankings, such as more Search Console data.

11. Accessibility

Does accessibility matter? It does, indeed.

Does the search algorithm include a signal indicating whether a site is accessible? No, it does not.

Accessibility does not yet influence rankings.

Some ranking signals, such as alt attributes and proper heading usage, happen to be necessary for accessibility. But search engines evaluate those individual elements, not whether your page passes an accessibility audit.

Despite this, you should still make your page accessible. Not doing so will likely result in legal action.

12. Accuracy of the Text

It’s difficult to present correct material, despite Google and Bing’s best efforts.

Google and Bing are more aware of what the general opinion on the web says than they are of what is factual. Sometimes the web isn’t accurate.

But more crucially, the search engines are attempting to match query intent and determine authority using other signals (cough, cough, links!).

Right now, the issue is not whether the data is accurate or not (as this is very hard to do). It depends more on whether or not the website demonstrates that it is credible and authoritative. Danny Sullivan says the same thing here.

Search engines aren’t really assessing “correctness” so much as popularity, or online consensus, because they can only observe what the majority of people say. This explains why inaccurate information keeps appearing in the Knowledge Graph.

It’s also sort of how Google Translate functions, which explains why certain instances of gender prejudice and other problems show up there.

Unfortunately, that is the style in which the majority of online material is written.

13. Social Signals

Matt Cutts told us back in 2010 that Google does not use social signals. (Except for the period when they actually used their own Google+ signals.)

Google doesn’t use any social network-specific analytics, such as friend or follower counts.

Not at all.

Most social networks block them from crawling. Many individuals keep their accounts private.

Google simply can’t reach much of that information.

But let’s presume they can. What would happen if they were using that data and Twitter suddenly blocked them via robots.txt?

Overnight, the rankings would shift significantly.

That isn’t what Google wants. They are all focused on achieving robustness and scalability.

However, they do crawl social networks where and when they can, and they probably treat those pages the same as any other web page.

If you have a high-PageRank social media profile that links to other content, a portion of that authority may pass through.

I’ve jokingly said for years that I want to build a search engine that exclusively considers social signals.

But picture how embarrassing it would be to search for a sensitive medical issue and get back only joke memes about it.

The way individuals share information on social media differs from how they search for numerous topics.

You can understand why social signals aren’t the greatest metrics for Google to utilize if you just consider the results a search engine that only considered social shares would return for your favorite or least favorite politician.

14. Subdirectories or Subdomains

Google is unconcerned.

They may have cared in the past. However, search engines are now far better at figuring out whether you’re using a subdomain as a standalone website or as part of your main site, and they treat it accordingly.

When it comes to subdomains vs. directories, it’s not about the domain or directory itself, but rather how you use them and connect them to everything else.

Yes, I am aware that there are several studies out there that claim switching from one to the other results in a decline.

But in each of those experiments, the navigation, user experience, and connecting structure were also altered.

Naturally, replacing many links to subpages with a single link to a subdirectory will affect your SEO. However, links and PageRank, not the URL structure itself, are responsible.

15. Unlinked Mentions

Here, certain Google comments were misunderstood by SEO experts.

Google has confirmed that unlinked mentions are not treated as links. Eric and Mark even ran a test, and the results showed no change in rankings.

What is most likely happening here is that unlinked mentions are used directly for the knowledge graph and entity identification, and only indirectly for ranking.

Do rankings change as a result of the knowledge graph?

Undoubtedly, and in a variety of indirect ways, but we should cite those indirect effects as the cause rather than the unlinked mentions themselves.

Top Ranking Factors You Should Not Ignore

Most Google ranking criteria lists are far too extensive. Instead of concentrating on the important factors, they list all the possible ones.

Even worse, the majority contain a lot of misconceptions because no one is familiar with them all.

So today, we’re going to try something new. Instead of going over the 200+ ranking elements, we’ll focus on the top 10, in our opinion.

In no particular sequence, here they are:

1. Backlinks

Possibly the most significant ranking component is backlinks.

How do we know? Backlinks are the basis of PageRank, the cornerstone of Google’s ranking system. And before you claim PageRank is outdated: Google’s Gary Illyes confirmed in 2017 that it is still in use.

Independent studies, such as our analysis of more than a billion web pages, support the association between backlinks and organic traffic.
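The core idea of PageRank can be sketched in a few lines as a power iteration over a link graph. The damping factor and the toy graph below are illustrative only, not Google’s actual parameters or data; the point is simply that pages attracting more inbound links accumulate more rank.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal PageRank power iteration.
    links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages        # dangling pages share evenly
            share = damping * rank[page] / len(targets)
            for target in targets:
                new[target] += share
        rank = new
    return rank

# "c" earns links from both "a" and "b", so it ends up with the most rank.
graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The total rank always sums to 1; links simply redistribute it, which is why earning links from strong pages matters more than earning many weak ones.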

But not every backlink is created equal.

A backlink’s effectiveness is influenced by several variables, but the two most crucial are relevancy and authority.

  • Relevance

Think about searching your city for the greatest Italian eatery. You seek advice from two pals. One works as a chef, the other as a veterinarian. Whose opinion do you respect?

Most likely the chef, who is familiar with Italian food.

It would be the reverse if you were asking for advice on dog food.

On the internet, the same concept is used. The most valuable links come from websites and pages that are relevant.

  • Authority

Strong pages on powerful websites tend to have backlinks that have the biggest impact.

By examining the Domain Rating and URL Rating of a linking domain and web page in Ahrefs’ Site Explorer, you can gauge their relative strength.

2. Freshness

Freshness is a query-dependent ranking factor, meaning certain searches prioritize it more than others.

For instance, all of the “Brexit news” results are really recent. Google even displays a “Top Stories” section with recent results.

Freshness doesn’t really matter for a query like “how to tie a tie,” since the procedure never changes.

A guide that is 10 years old can be just as effective as one published yesterday.

That explains why both new and old pages are shown in Google’s top five results:

4. Search Intent

For each query, Google does not prioritize the same kind of material.

For instance, a person searching “purchase clothes online” is in buying mode. They want to see the products available for purchase, so Google shows e-commerce category pages.

In contrast, someone looking up “how to tie a tie” is in learning mode. They want to learn how to tie one, not buy one, so Google shows blog content instead.

To optimize for search intent, it’s a great idea to examine the current top-ranking results through the lens of the four C’s of search intent.

There are four C’s:

  1. Content style
  2. Content type
  3. Content format
  4. Content angle

Covering everything that users are looking for is essential if Google is to rank the most beneficial result for the query.

But the issue here isn’t the length of the content. Longer content isn’t always better.

Covering what the searcher considers significant and what they anticipate seeing is the goal.

Consider the query “top watch brands,” for instance.

Search intent analysis shows that users desire lists of the top luxury watchmakers and brands. To further understand what matters in terms of content, let’s examine the similarities among the top-ranking pages.

5. Page Speed

Page speed has been a ranking factor since 2010, when it affected 1% of desktop search queries.

That changed in 2018, when Google extended it to mobile searches as well.

Even so, it still only affects “a tiny fraction of searches,” mostly web pages that “provide the slowest experience to visitors.”

That is a crucial point. The goal here isn’t to beat rivals by a few milliseconds. What matters is making sure your website loads quickly enough not to negatively impact users.

What is that speed?

In 2018, Google said that the TTFB (Time to First Byte) for mobile sites should be less than 1.3 seconds and that consumers should be able to see information in under three seconds.

Additionally, they advise that a mobile web page’s overall size should be less than 500kb.
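If you want to sanity-check the TTFB guideline yourself, here is a rough sketch using only the Python standard library. The URL and the 1.3-second threshold are just the example from above; real measurement tools like WebPageTest or Lighthouse are far more rigorous.

```python
import time
import urllib.request

def time_to_first_byte(url, timeout=10):
    """Rough TTFB estimate: elapsed time from issuing the request
    until the first byte of the response body can be read."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read(1)  # block until the first byte arrives
        return time.perf_counter() - start

# Example (assumes network access):
# ttfb = time_to_first_byte("https://example.com/")
# print(f"TTFB: {ttfb:.3f}s", "OK" if ttfb < 1.3 else "slow")
```

Note that this number includes DNS lookup and TLS handshake time, which browser-reported TTFB also includes, so it is a reasonable first approximation.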

Take these recommendations with a grain of salt, though, as Google’s John Mueller stated just a few months ago that TTFB isn’t utilized for search ranking purposes.


I hope that has cleared up a lot of the misunderstanding about these particular aspects.

I prefer to consider how I’d code or scale anything whenever we discuss whether it is or isn’t a factor.

Often, I can see all the issues with utilizing it just by performing that mental exercise.

I genuinely think that when Google and Bing tell us that this or that is not a ranking factor, they are telling the truth.

In their responses, they may purposefully leave room for interpretation, and they do use precise language.

But I don’t believe they are lying to us.
