You can edit your robots.txt directly from the OpenCart admin panel.
What is Robots.txt?
Robots.txt is a simple text file that webmasters create and place in the root of a website so that search bots can read it before crawling the site.
Quick Tip! You can activate robots.txt in Admin / SEO Module / Dashboard by clicking Setup Robots.txt.
The robots.txt file will be created and the default OpenCart rules will be written to it. You can modify the default rules by visiting Admin / SEO Module / Settings / Robots.txt tab.
Robots.txt is part of the robots exclusion protocol (REP), a set of standards that regulates how robots crawl the web. The Meta Robots tag is also part of REP.
How to set up robots.txt in OpenCart?
OpenCart does not ship with a robots.txt file, so you are required to create it yourself and upload it via FTP. You can use any code editor.
Here are some useful rules you should know:
1. Before you go live, you may want to keep your website under the radar of search crawlers. You can block them by adding this rule to your robots.txt:
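The following standard directives block all crawlers from the entire site. Remember to remove them before launch, or your store will stay out of the search results:

```
# Block all crawlers from the whole site (pre-launch only)
User-agent: *
Disallow: /
```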
2. You always want to keep only the most relevant pages in the search results to have the highest relevance ratio. You can block system pages by adding the following rule:
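A sketch of such rules, assuming OpenCart's default admin, system, and query-string routes; adjust the paths to match your own store:

```
# Keep OpenCart system pages out of the index
# (paths below assume the default installation)
User-agent: *
Disallow: /admin/
Disallow: /system/
Disallow: /*route=account/
Disallow: /*route=checkout/
```

Note that the `*` wildcard is supported by major crawlers such as Google and Bing, though it is not part of the original robots.txt standard.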
3. OpenCart has sorting and filtering features in categories. These can generate an almost unlimited number of URLs that are essentially duplicates of your category pages. You can keep these variants from being indexed too:
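A sketch of such rules; the parameter names below assume OpenCart's default sorting and filter query strings, so verify them against your store's URLs:

```
# Block sorted/filtered duplicates of category pages
User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?order=
Disallow: /*&order=
Disallow: /*?limit=
Disallow: /*&limit=
Disallow: /*?filter=
Disallow: /*&filter=
```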
4. Robots.txt is also a good place to specify your sitemap.xml URL.
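For example (replace the placeholder URL with your store's actual sitemap location):

```
# Point crawlers to the sitemap (example URL, not your real one)
Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap` directive takes a full absolute URL and can appear anywhere in the file.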
Important! Robots.txt is not binding on search bots and can be ignored, so it is always best to hide sensitive content in more secure ways, such as password protection or server-side access rules.