Which Pages Of A Site Should Be Closed From Indexing?

Whether created manually or generated automatically, a website can contain pages that are useful to users or necessary for normal operation, but that should not be handed to search robots for indexing. Such pages need to be closed from indexing so that they do not appear in search results. How this can be done, and which pages should be closed, is discussed in more detail below.
If a site’s pages duplicate existing texts or contain only service data, they are not just useless for search engine promotion; they can be actively harmful. Search engines may judge such pages poorly because of their non-unique or uninformative content, and as a result the performance of the site as a whole can suffer.
Simply deleting such pages from the site is not an option. First, deletion produces 404 errors, which also harm search promotion; second, users or administrators may still need these pages. The solution is to close the pages from indexing: visitors will still see what they need, but search robots will not.
When individual pages of a site are closed from indexing, it is important to select them carefully. Otherwise you can accidentally close the “right” pages, with predictable consequences: if search robots cannot index the pages with unique thematic content, the site will not be able to reach high positions in the search results.
Pages Of The Administrative Part Of The Site.
The pages of the administrative part are intended strictly for internal use, and they certainly should not be indexed by search robots. As a rule, such pages are already closed via the Disallow directive in the robots.txt file that popular ready-made CMS generate automatically. If the robots.txt file was created manually or has been modified, it is important to check that the indexing ban covers all pages related to site management.
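For illustration, a minimal robots.txt fragment of this kind might look like the lines below; the /admin/ and /wp-admin/ paths are generic examples, and the actual administrative paths depend on the CMS in use.

User-agent: *
Disallow: /admin/
Disallow: /wp-admin/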
Personal Information Pages.
Pages with personal user data appear on sites of various types, in particular forums, blog platforms, and social networks. They are of practical use to visitors, but allowing them to be indexed is harmful: the content of such pages differs only slightly from one to another, so search engines treat it as non-unique.
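One common way to close such pages, mentioned later in this article, is the robots meta tag placed in the head section of the page template. A minimal example is shown below; which templates it belongs in depends on the particular site.

<meta name="robots" content="noindex, nofollow">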
Site Search Results Pages.
As with pages containing personal user data, there is no need for search robots to index internal search results: these pages are useful to visitors, but from the point of view of search engines they are “garbage” because their content is non-unique.
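If, for example, the internal search results live under a /search/ path or use an s= query parameter (both are assumptions here; the real URL pattern depends on the site), they can be closed in robots.txt roughly as follows:

User-agent: *
Disallow: /search/
Disallow: /*?s=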
Duplicate Site Pages.
Pages containing identical information can appear on a site due to the peculiarities of its content management system (CMS). This situation is especially common in online stores, where separate pages can be generated for filters and sorting options, as well as for tags and labels.
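When such duplicates are produced by URL parameters, one option is to close those parameters in robots.txt. The sort and filter parameter names below are only assumptions for illustration; the actual names depend on the store’s CMS.

User-agent: *
Disallow: /*?sort=
Disallow: /*?filter=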
Basket, Checkout, And Order Pages.
Basket and checkout pages are, of course, found in online stores, while order or request pages can also appear on other types of sites. These pages carry no meaningful content for search, and therefore they should be closed from indexing by search robots.
How exactly can the various pages of a site be closed from indexing by search engines? There are several methods, including prohibitive directives in the robots.txt file, a robots meta tag in the page code, the so-called 301 redirect, and an indexing ban set up in .htaccess. The choice of method depends on the nature of the pages to be closed, and for the most effective result the decision is best left to website optimization specialists.
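As a rough sketch of the last two methods on an Apache server, a .htaccess file might redirect a duplicate URL to its main version and forbid indexing of certain files via the X-Robots-Tag header; the paths and file mask below are hypothetical examples, and the header rule requires the mod_headers module.

Redirect 301 /old-duplicate-page/ /main-page/
<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>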
For professional help, contact us today, a professional SEO agency in Dubai.
Call 00971567300683