
With the rise of digital technology, many business owners have come to appreciate what digital platforms and search engine optimisation (SEO) can do to boost the strength and reach of their business.

 

But as SEO evolves, best practices change with it, which means yesterday's winning strategy may not work so well today. That is especially true for the robots.txt file and the XML sitemap, since both are almost as old as SEO itself.

 

Even so, these two long-standing techniques remain relevant. Used properly, a robots.txt file and an XML sitemap help optimise a website from the ground up, earning more traffic from search engines.

 

To clear up the confusion and show you their importance, let's discuss why robots.txt and the XML sitemap still matter in SEO.


What Is Robots.txt?


 

Robots.txt is a plain-text file that tells search engine crawlers, or web bots, which pages they may and may not crawl. Set up properly, it directs the crawlers toward the significant parts of your website and away from the unnecessary ones.

 

What happens if you do not set up a robots.txt file?

 

Your website can still be visible online, but crawlers will have no guidance on what matters most. Worse, a robots.txt file that is set up incorrectly can have a disastrous outcome: your site may not appear in search results at all.

 

Get robots.txt right, however, and this small file at the back end of your website can save you a lot of trouble and give your business the boost it needs. Your other SEO marketing strategies will also work more effectively.

 

Imagine how your business would suffer if the website built to reach more potential customers is not even indexed. Your competitors would leave you far behind, and you should not let that happen.

 

How and where do you get such a file? 

 

The standard format looks like this:

 

User-agent: [user-agent name]

Disallow: [URL string not to be crawled]

Allow: [URL string you want to be crawled]

Sitemap: [URL of your sitemap.xml]

 

For the user-agent, you can target particular web crawlers such as Bingbot, Googlebot, or Baiduspider, or use an asterisk (*) to address all crawlers at once. Choose the crawler you want to instruct; the rules that follow will then apply to that specific search engine, so your SEO strategies can work better there.

 

For example, if you do not want Bingbot to crawl a particular section of your site, the set-up you need is this:

 

User-agent: Bingbot 

Disallow: /nobingbot/

 

This format also allows you to slow crawlers down using the Crawl-delay directive:

 

Crawl-delay: 20

 

The number 20 means that web crawlers have to wait 20 seconds between requests to pages of the website. You can change the figure depending on how much delay you want.

 

Meanwhile, the Disallow directive is the part of robots.txt where you specify which pages the crawlers should not crawl. You do not have to write the full URL of each page; the path relative to your domain is enough:

 

Disallow: /search/realtime

Disallow: /search/users

 

You can also point crawlers at specific pages you want them to crawl by using "Allow" instead of "Disallow":

 

Allow: /search/realtime

Allow: /search/users

 

The last line is the Sitemap directive. This is where you place the full URL of your sitemap.xml file so that search engine crawlers can find and read it immediately.
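Putting the directives above together, a complete robots.txt file might look like the sketch below. The domain and paths are placeholders, not recommendations for any particular site:

```text
User-agent: *
Crawl-delay: 20
Disallow: /search/realtime
Disallow: /search/users
Allow: /search/help

Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (for example, https://www.example.com/robots.txt), which is the only location crawlers check.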


What Is XML Sitemap?


 

An XML sitemap is a file that lists all the pages of your website. Its primary goal is to help web bots crawl those pages more efficiently, making it a road map for your site. Done right, the XML sitemap guides search engines to all your significant pages.

 

It should be clear to everyone that the XML sitemap is very significant and still relevant for SEO today. Every SEO company uses one so that a website's pages get discovered faster by search engines and their web bots.

 

Remember that Google and other search engines rank individual pages, not just websites in general. The XML sitemap therefore directly helps your business get seen on the web. Whenever you add or change pages, you also need to update the XML sitemap.
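For illustration, a minimal sitemap.xml follows the structure below. The URLs and date are placeholders; only the loc element is required for each page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services</loc>
  </url>
</urlset>
```

Each url entry is one page; the optional lastmod date tells crawlers when a page last changed, which helps them decide what to re-crawl.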

 

How and where do you get one?

 

There are XML sitemap generator tools available online that generate the file for you and make things easier for your SEO.
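For small sites, you can even generate the file yourself. Below is a minimal sketch in Python using the standard library; the build_sitemap helper and the example.com URLs are hypothetical, for illustration only:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    # Root <urlset> element with the sitemap protocol namespace
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        # One <url><loc>...</loc></url> entry per page
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/services",
])
print(sitemap)
```

You would save the output as sitemap.xml at the root of your site and reference it from robots.txt.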

 


Conclusion

Even though many consider robots.txt and the XML sitemap old SEO practices, they remain significant. That is why you should include them to further optimise your website.

 

To learn more about them and other SEO techniques, get in touch with our SEO agency in Singapore. Contact OOm at 6391 0930.
