Sullivan answers questions about the future of the web
Sullivan and Solis – sos-wp.it
But, and here comes Solis' question, these platforms are huge: they occupy an enormous amount of space, both visually and in terms of visits, and can therefore overshadow content from smaller sites whose value may be genuinely superior.
What is missing, and this is what Solis wants to focus on, is a filter that actually allows users to understand whether a result sits at the top because it deserves to, or simply because it lives on a platform whose brand and established position let it cover most of the available space with its own shadow.
Sullivan's answer starts from his personal experience. He recalls that, in reality, it is not only the big forums that get pushed to the top: depending on what you search for, answers also surface from forums where expertise comes from being an enthusiastic user of a particular product or service.
Building a good robots.txt file means building a file that keeps your relationship with bots exactly what you decided it should be, no more and no less. Inside it are the directives that bots must keep in mind when they crawl your site built with WordPress.
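
As a minimal sketch, here is what such a file might look like on a typical WordPress install; the paths match the common WordPress defaults, while the sitemap URL is a placeholder you would swap for your own:

# Rules for every crawler
User-agent: *
# Keep bots out of the admin area...
Disallow: /wp-admin/
# ...but let them reach the AJAX endpoint many themes and plugins use
Allow: /wp-admin/admin-ajax.php
# Tell crawlers where the XML sitemap lives (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml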
It may not seem like it, but even an incorrect configuration of the robots.txt file can have repercussions on SEO performance, as we will see. Over time, various beliefs have spread online about how this file, which looks like a very ordinary text file, should be written. Even John Mueller has spoken about it several times quite recently.
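
A classic example: a single leftover line, often forgotten after a development phase, is enough to tell every compliant crawler to stay away from the entire site:

# DANGER: this shuts the whole site off from all compliant bots
User-agent: *
Disallow: /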
On Google Search Central there is now a note (if you don't see it, we recommend replacing the language indicator "it" with "en" in the URL) that clarifies which are the only four fields supported by the bots working for the big G. These four fields are: user-agent, allow, disallow, sitemap.
Everything else is ignored.
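
To make that concrete, here is a sketch that mixes supported and unsupported fields; Google's crawlers act on the four above and skip the rest (crawl-delay, for example, is honored by some other crawlers but not by Google, and noindex was never an official robots.txt field):

User-agent: Googlebot
Disallow: /private/    # supported: Googlebot stays out of /private/
Crawl-delay: 10        # ignored by Google's crawlers
Noindex: /drafts/      # ignored as well
Sitemap: https://www.example.com/sitemap.xml    # supported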