You've done your best, but your site still doesn't appear in Google's results?
Bad news: there are many possible reasons.
Good news: almost all of them can be fixed.
In this article, we cover 9 common reasons why a site does not rank on Google and give step-by-step instructions for fixing each one.
To be ranked on Google, there are three key factors to consider:
1. The search engine knows about the existence of your site, and all the important pages are accessible to search bots and indexed.
2. The site has a page relevant to the query you want to rank for.
3. The page meets the quality criteria of the search engine.
The main reasons why your site doesn't have rankings and what to do about it
1. Your website is new
The most obvious reason for a lack of rankings is that Google does not yet know your site exists. To check whether a site is in the search engine's index, enter: site:yourwebsite.com
If this query returns at least one result, the search engine has already found and indexed the site. But even if the search engine is familiar with the site, that does not mean it has indexed the specific page you expect to see in the results. To check whether a promoted page is in the SERP, enter: site:yourwebsite.com/page-you-want-to-see-in-google/
This query should return exactly one result. If the search engine does not see the page you need, create a sitemap and submit the file to Google: Search Console > Sitemaps > Add Sitemap > Submit
The sitemap tells the search engine which pages are important and where to find them.
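For reference, a minimal sitemap.xml might look like this (the URL and date are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want crawled and indexed -->
    <loc>https://yourwebsite.com/page-you-want-to-see-in-google/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```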
2. The site has blocked access to search robots
If you tell Google not to show specific pages in search, it won't. Indexing is blocked with the noindex meta tag.
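The tag sits in the page's `<head>` and looks like this:

```html
<!-- Tells all search engine bots not to index this page -->
<meta name="robots" content="noindex">
```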
Pages closed by this meta tag are not indexed by the search engine, even if they appear in the sitemap and the sitemap has been submitted to Search Console. You may not remember adding the meta tag, but that does not mean it is not in the code.
Developers often use this tag while a site is being built to keep unfinished pages out of the index, and then simply forget to remove it.
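A quick way to audit pages for a forgotten noindex tag is to parse the HTML yourself. Here is a small sketch using only Python's standard library; the `has_noindex` helper is our own illustration, not a real API:

```python
from html.parser import HTMLParser


class NoindexChecker(HTMLParser):
    """Scans a page's HTML for a robots/googlebot meta tag with 'noindex'."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        # A robots (or Googlebot-specific) directive containing "noindex"
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True


def has_noindex(html: str) -> bool:
    parser = NoindexChecker()
    parser.feed(html)
    return parser.noindex


# Example: a page left with a development-time noindex tag
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(page))  # True
print(has_noindex("<html><head><title>OK</title></head></html>"))  # False
```

In practice you would fetch each important URL and run its HTML through this check, but the parsing logic is the same.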
3. robots.txt blocks access to search robots
The robots.txt file tells search bots how to behave on the site. Bots will not crawl pages that the file disallows, which means those pages will not appear in search results.
You can check robot access manually in the robots.txt file, which lives at: yourdomain.com/robots.txt
If that URL returns a 404 error, the site has no robots.txt file.
What the file must not contain is a blanket `Disallow: /` directive under `User-agent: *` or `User-agent: Googlebot`. This command prevents Googlebot from crawling every page on the site. Make sure `Disallow` rules do not cover important sections of the site.
Remove all directives that prohibit Google from processing content if you want pages to appear in search results.
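You can test robots.txt rules locally with Python's standard `urllib.robotparser` before deploying them. The snippet below checks a hypothetical file containing exactly the kind of blanket directive described above:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks Googlebot from the entire site --
# the kind of directive that keeps pages out of search results.
robots_txt = """\
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may not crawl any page under this configuration
print(parser.can_fetch("Googlebot", "https://yourdomain.com/important-page/"))  # False
# A bot with no matching rule (and no wildcard entry) is unaffected
print(parser.can_fetch("OtherBot", "https://yourdomain.com/important-page/"))  # True
```

Running the same check against your live file (via `parser.set_url(...)` and `parser.read()`) shows which pages Googlebot is allowed to crawl.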
4. The site lacks quality links
Even if Google knows your page exists, that does not mean it will appear in the results. You also need to demonstrate the page's authority to the search engine.
Although hundreds of factors influence rankings, inbound links from unique domains remain one of the strongest signals, as numerous Ahrefs studies have shown.
If the pages at the top of the results have far more inbound links than yours, that is very likely the main reason your site ranks low.
Note that Google ranks pages, not sites. Therefore, it is important to consider the number of unique high-quality links to a specific page and not to the site as a whole.
5. The page lacks credibility and authority
Google's algorithm is built on PageRank, a credibility metric that counts external and internal links as votes. Although public PageRank scores are no longer available, Google confirmed in 2017 that PageRank remains a strong ranking factor.
You cannot find out what PageRank a page has been assigned, but that does not mean the metric cannot be influenced.
Ahrefs' URL Rating (UR) is calculated with a formula close to the original PageRank.
The volume of search traffic on a site depends on the authority: the higher the authority, the more search traffic the site receives.
You can check a site's authority in the free Backlink Checker tool.
To understand why the site does not appear in search results, or appears only in low positions, analyze the top-ranking sites and compare their UR. The report answers the question of what to do: alongside UR, it shows the number of backlinks in the profile and the number of unique referring domains. If the gap is large, you should focus on building links.
Internal links also help increase a promoted page's credibility, provided the linking pages are topically relevant.
6. The site lacks authority
Google does not give a definitive answer on whether site-level authority is a ranking signal: its representatives sometimes hint that it exists and sometimes state that no such metric is in the ranking formula.
Ahrefs' Domain Rating (DR) reflects the authority of a domain, and Ahrefs' research shows that the higher the DR, the higher a site tends to rank.
You can find a site's DR in the same free Backlink Checker tool.
7. The page doesn't match the user's search intent
The algorithm's main task is to rank highest the results that most fully satisfy the user's intent. That is why it is crucial to make the page what the user expects to see.
If the page does not match the search intent, then even with high authority metrics, it will not appear in the search results.
8. The site has duplicate content
Duplicate content is identical or very similar content hosted at different URLs. Search engines try not to index duplicates, so as not to waste space in the index; instead, they index the page marked with the canonical attribute.
If no page has been given the canonical attribute, Google will try to determine on its own which version is more relevant and deserves a place in the index.
Google's algorithm is far from perfect, so both pages can end up in the index.
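To point Google at the preferred version yourself, add a canonical link to the `<head>` of every duplicate (the URL is a placeholder):

```html
<!-- Each duplicate URL declares the same preferred page -->
<link rel="canonical" href="https://yourwebsite.com/preferred-page/">
```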
9. The site is under Google penalties
This is the least likely reason for not ranking on Google, but it does happen.
Penalties can be manual or algorithmic. Google reports manual actions in Search Console.
Algorithmic penalties are trickier, since Google does not notify the webmaster of a drop in positions or search visibility; a sudden drop in search traffic is the main clue.
If you notice a drop in organic traffic, first check whether a core algorithm update rolled out recently. If traffic fell after an update, the only remedy is to work on the quality of the content and the authority of the site.
To do that, see our checklist.