
This is very common in two contexts:


Content that should not be crawled (it is not relevant to search) and content that should not be indexed (it is restricted or confidential).

Content that should not be crawled

Let's say you put together company material to show to employees, and found it easier to put it on a page of the website to facilitate distribution. This particular content is not necessarily confidential, but it is of no relevance to anyone outside the company. However, if you publish it on the website, the Google robot will find the material and make it available in search. In principle this is not a problem, but in practice you are asking the search engine to spend time crawling a useless page when it could be spending that time on another page that really matters to your audience. To prevent this from happening, the developer needs to inform Google that this and other pages should not be crawled. This can be done by editing the robots.txt file.

Content that should not be indexed

This case happens when your website has a restricted area, for example. The content of this section should not be displayed in search results, as it is restricted to certain users only. Here, too, the developer needs to inform Google not to index the material. To do this, a tag must be inserted on the page in question, telling the robot that it should not be indexed. The difference, in this case, is that non-indexed content will not appear on Google at all.
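As an illustration, the two mechanisms look like this (the `/internal/` path is a hypothetical example):

```
# robots.txt at the site root: blocks crawling of the listed path
User-agent: *
Disallow: /internal/
```

And, to block indexing of a single page, a tag in the HTML `<head>` of that page:

```
<meta name="robots" content="noindex">
```

Note that the robots.txt rule stops the robot from visiting the page, while the noindex tag lets it visit but tells it not to show the page in results.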

Checking crawl-blocking commands

Non-crawled content may still appear, but the robot will not revisit it to update its listing.

Crawling and indexing errors

What do these two features have to do with a site being blocked? The website developer may have applied both practices incorrectly, telling Google not to index and/or crawl the website. To identify whether this is happening to you, run the tests below.

Checking crawl-blocking commands: enter your website address followed by /robots.txt (e.g. flammo.com.br/robots.txt); if the address returns nothing, there are no crawl blocks; if the address opens a text file, look for the "disallow" directive.
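The manual check above can also be sketched in code. This is a deliberately simplified parser (it ignores the full robots.txt group-merging rules, and the sample file and paths are invented for illustration): it scans a robots.txt body and lists the paths disallowed for a given user agent.

```python
# Minimal sketch: list the Disallow paths in a robots.txt body that
# apply to a given user agent. Simplified; real crawlers merge rule
# groups and match agent names more loosely than this.

def find_disallows(robots_txt: str, agent: str = "*") -> list[str]:
    """Return Disallow paths from the group matching `agent`."""
    blocked = []
    applies = False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments
        if not line or ":" not in line:
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        field = field.lower()
        if field == "user-agent":
            applies = value == agent
        elif field == "disallow" and applies and value:
            blocked.append(value)
    return blocked

# Invented sample file for demonstration.
sample = """\
User-agent: *
Disallow: /internal/
Disallow: /restricted/

User-agent: Googlebot
Disallow: /drafts/
"""

print(find_disallows(sample))               # rules for every crawler
print(find_disallows(sample, "Googlebot"))  # rules for one crawler
```

If a path you want in search results shows up in this list, the robots.txt file is blocking its crawling and should be edited.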
