One of our readers recently asked us how to optimize the robots.txt file to improve SEO.
The robots.txt file tells search engines how to crawl your website, which makes it a very powerful SEO tool.
We’ll show you how to create a perfect robots.txt file for SEO.
What is the robots.txt file?
Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their site.
It is typically stored in the root directory of your website, also known as the main folder. The basic format of a robots.txt file looks like this:
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

User-agent: [user-agent name]
Allow: [URL string to be crawled]

Sitemap: [URL of your XML Sitemap]
You can have multiple lines of instructions to allow or disallow specific URLs, and you can add multiple sitemaps. If you do not disallow a URL, search engine bots assume that they are allowed to crawl it.
Here is what a sample robots.txt file can look like:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml
In the robots.txt example above, we allowed search engines to crawl and index files in our WordPress uploads folder.
After that, we disallowed search bots from crawling and indexing the WordPress plugins and admin folders.
Finally, we provided the URL of our XML sitemap.
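If you want to double-check how a crawler would read rules like these, Python’s standard library ships a robots.txt parser you can run locally. The sketch below feeds the sample file above into `urllib.robotparser`; the `example.com` sitemap URL and the test paths are just placeholders for illustration.

```python
from urllib import robotparser

# The sample robots.txt from above, as a string.
sample = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap_index.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(sample.splitlines())

# Files in the uploads folder are crawlable...
print(parser.can_fetch("*", "/wp-content/uploads/photo.jpg"))

# ...but plugin files and the admin area are not.
print(parser.can_fetch("*", "/wp-content/plugins/seo/file.php"))
print(parser.can_fetch("*", "/wp-admin/"))

# The sitemap line is picked up as well (Python 3.8+).
print(parser.site_maps())
```

This is only a sanity check on the file’s syntax and rule matching; individual search engines may interpret edge cases slightly differently.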
Do you need a robots.txt file for your WordPress site?
If you do not have a robots.txt file, search engines will still crawl and index your website. However, you will not be able to tell them which pages or folders they should not crawl.
This will not have much of an impact when you are first starting a blog and do not have a lot of content.
However, as your website grows and you add a lot of content, you will likely want better control over how it is crawled and indexed.
Here is why: search bots have a crawl quota for each website.
This means that they crawl a certain number of pages during a crawl session. If they do not finish crawling all the pages on your site, they will come back and resume crawling in the next session.
This can slow down the indexing rate of your website.
You can fix this by disallowing search bots from trying to crawl unnecessary pages such as your WordPress admin pages, plugin files, and themes folder.
By disallowing unnecessary pages, you save your crawl quota. This helps search engines crawl even more pages on your site and index them as quickly as possible.
Another good reason to use a robots.txt file is when you want to stop search engines from indexing a post or page on your website.
It is not the safest way to hide content from the general public, but it will help you prevent that content from appearing in search results.
What does an Ideal Robots.txt file look like?
Many popular blogs use a very simple robots.txt file. Its contents may vary depending on the needs of the specific site:
User-agent: *
Disallow:

Sitemap: http://www.abouredacoder.com/post-sitemap.xml
Sitemap: http://www.abouredacoder.com/page-sitemap.xml
This robots.txt file allows all bots to index all content and gives them a link to the website’s XML sitemaps.
For WordPress blogs, we recommend the following rules in the robots.txt file:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.abouredacoder.com/post-sitemap.xml
Sitemap: http://www.abouredacoder.com/page-sitemap.xml
This tells search bots to index all WordPress images and files. It disallows search bots from indexing WordPress plugin files, the WordPress admin area, the WordPress readme file, and affiliate links.
By adding sitemaps to the robots.txt file, you make it easy for Google bots to find all the pages on your website.
Now that you know what an ideal robots.txt file looks like, let’s take a look at how you can create a robots.txt file in WordPress.
How to create a robots.txt file in WordPress?
There are two ways to create a robots.txt file in WordPress. You can choose the method that works best for you.
Method 1: Editing the robots.txt file using All in One SEO.
All in One SEO, also known as AIOSEO, is the best WordPress SEO plugin on the market, used by over 2 million websites.
It is easy to use and comes with a robots.txt file generator.
If you do not already have the AIOSEO plugin installed, you can see our step-by-step guide on how to install a WordPress plugin.
Note: A free version of AIOSEO is also available and includes this feature.
Once the plugin is installed and activated, you can use it to create and edit your robots.txt file directly from your WordPress admin area.
Simply go to All in One SEO » Tools to edit your robots.txt file.
First, you will need to turn on the editing option by clicking the blue “Enable Custom Robots.txt” toggle. With this option enabled, you can create a custom robots.txt file in WordPress. All in One SEO will show your existing robots.txt file in the “Robots.txt Preview” section at the bottom of your screen. This version will show the default rules that were added by WordPress.
These default rules tell search engines not to crawl your core WordPress files, allow bots to index all content, and provide a link to your website’s XML sitemaps. Now you can add your own custom rules to improve your robots.txt for SEO. To add a rule, enter a user agent in the “User Agent” field. Using a * will apply the rule to all user agents. Then select whether you want to “Allow” or “Disallow” search engines to crawl. Next, enter the filename or directory path in the “Directory Path” field.
The rule will automatically be applied to your robots.txt file. To add another rule, click the “Add Rule” button.
We recommend adding rules until you have created the ideal robots.txt format shared above.
Once you are done, do not forget to click the “Save Changes” button to store your changes.
Method 2: Editing the robots.txt file manually using FTP.
For this method, you will need to use an FTP client to edit the robots.txt file.
Simply connect to your WordPress hosting account using an FTP client.
Once inside, you will be able to see the robots.txt file in your website’s root folder.
If you do not see one, then you likely do not have a robots.txt file.
In that case, you can simply create one. Robots.txt is a plain text file, which means you can download it to your computer and edit it using any plain text editor such as Notepad or TextEdit. After saving your changes, you can upload the file back to your website’s root folder.
How do you test your robots.txt file?
Once you have created your robots.txt file, it is always a good idea to test it using a robots.txt tester tool.
There are many robots.txt tester tools out there, but we recommend using the one inside Google Search Console.
First, you will need to have your website linked with Google Search Console. If you have not done this yet, see our guide on how to add your WordPress site to Google Search Console.
Then, you can use the Google Search Console robots testing tool. Simply select your property from the dropdown list. The tool will automatically fetch your website’s robots.txt file and highlight any errors and warnings it finds.
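Google’s tool is the authoritative check, but you can also run a quick offline sanity test with Python’s built-in `urllib.robotparser` module. The rules and paths below are hypothetical placeholders; substitute your own robots.txt contents and the URLs you care about.

```python
from urllib import robotparser

def crawlable(rules_text, user_agent, path):
    # Parse robots.txt rules from a string and check whether
    # the given path may be crawled by the given user agent.
    parser = robotparser.RobotFileParser()
    parser.parse(rules_text.splitlines())
    return parser.can_fetch(user_agent, path)

# Hypothetical rules for illustration. To check a live site instead,
# you would point the parser at your domain:
#   parser.set_url("https://yoursite.com/robots.txt"); parser.read()
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /refer/
"""

print(crawlable(rules, "*", "/wp-admin/options.php"))  # blocked
print(crawlable(rules, "*", "/blog/hello-world/"))     # allowed
```

This only verifies rule matching; it will not flag warnings the way Google Search Console does, so treat it as a complement rather than a replacement.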
The goal of optimizing your robots.txt file is to prevent search engines from crawling pages that are not publicly available, for example, pages in your wp-plugins folder or pages in your WordPress admin folder.
A common misconception among SEO experts is that blocking WordPress category, tag, and archive pages will improve crawl rate and result in faster indexing and higher rankings. That is not true. It is also against Google’s webmaster guidelines. We recommend that you follow the robots.txt format above to create a robots.txt file for your website. We hope this article helped you learn how to optimize your WordPress robots.txt file for SEO.
If you liked this article, please subscribe to our YouTube channel for WordPress video tutorials. You can also find us on Twitter and Facebook.