The syntax is very simple.
You assign rules to bots by specifying their user-agent (the name of the search engine bot), followed by the directives (the rules) that apply to it.
You can also use the asterisk (*) character to assign directives to all user agents at once: a rule declared under * applies to every bot, rather than to a specific one.
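For instance, a minimal block using the wildcard might look like this (the /admin/ path is just a placeholder):

```
# The * wildcard means this block applies to every bot
User-agent: *
Disallow: /admin/
```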
For example, the instructions would look like this if you wanted to allow all bots except DuckDuckGo to crawl your site:
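Here is a sketch of what that file could contain (DuckDuckGo's crawler identifies itself as DuckDuckBot):

```
# Block DuckDuckGo's crawler from the whole site
User-agent: DuckDuckBot
Disallow: /

# Every other bot may crawl everything
User-agent: *
Allow: /
```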
Note: A robots.txt file provides instructions, but it cannot enforce them. It is like a code of conduct: good bots (like search engine bots) will follow the rules, while bad bots (like spam bots) will ignore them.
The robots.txt file is hosted on your server, just like any other file on your website.
You can see the robots.txt file for any website by typing its full homepage URL and adding /robots.txt, such as https://www.example.com/robots.txt.
Note: A robots.txt file should always be located at the root of your domain. So, for the site www.example.com, the robots.txt file lives at www.example.com/robots.txt. Otherwise, crawlers will assume you don't have one.
Before learning how to create a robots.txt file, let's look at the syntax it contains.
Robots.txt Syntax
A robots.txt file consists of:
one or more blocks of "directives" (rules);
each with a specific "user-agent" (search engine bot);
an "allow" or "disallow" statement.
A simplified block might look like this:
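For instance, using Googlebot (Google's crawler) and a hypothetical /private/ directory:

```
# The user-agent line names the bot the rule applies to;
# the disallow line is the rule itself
User-agent: Googlebot
Disallow: /private/
```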