Robots.txt Generator
🕹 Robots.txt Creation and Validation
# Robots.txt creation:
```sh
npx seo-master create robots --allowed /home,/about --disallowed /admin,/security --sitemap https://www.nayanui.com/sitemap.xml --output ./robots.txt
```

You can also use the shorter version of this command:

```sh
npx seo-master create robots -a /home,/about -d /admin,/security -s https://www.nayanui.com/sitemap.xml -o ./robots.txt
```
# Attributes
Name | Attribute | Type | Default | Details |
---|---|---|---|---|
Allowed paths | --allowed / -a | string | | Allowed paths for the robots.txt. |
Disallowed paths | --disallowed / -d | string | | Disallowed paths for the robots.txt. |
Sitemap | --sitemap / -s | string | | Sitemap URL for the robots.txt. |
Output | --output / -o | string | ./robots.txt | Output path for the robots.txt. |
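With the flags above, the generated robots.txt would look roughly like this (an illustrative sketch; the exact ordering and formatting produced by the tool may differ):

```txt
User-agent: *
Allow: /home
Allow: /about
Disallow: /admin
Disallow: /security
Sitemap: https://www.nayanui.com/sitemap.xml
```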
# Robots.txt validation:
Validate your robots.txt either locally or through a URL.

```sh
npx seo-master validate robots --input ./robots.txt
```

You can also use the shorter version of this command:

```sh
npx seo-master validate robots -i ./robots.txt
```

You can also validate the robots.txt of your live website by passing its URL:

```sh
npx seo-master validate robots --input https://www.nayanui.com/robots.txt --isremote true
```
# Attributes
Name | Attribute | Type | Default | Details |
---|---|---|---|---|
Input Robots.txt | --input / -i | string | ./robots.txt | Input path for the robots.txt. |
Is Remote | --isremote / -ir | boolean | false | Pass true if the robots.txt is hosted remotely. |
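To give a feel for what robots.txt validation involves, here is a minimal sketch of a line-level checker in Node.js. This is an illustration only, not seo-master's actual implementation: it simply verifies that every non-comment line has a `directive: value` shape with a recognized directive name.

```javascript
// Minimal robots.txt line validator (illustrative sketch, not
// seo-master's internals). Returns an array of error messages.
const KNOWN_DIRECTIVES = new Set([
  'user-agent', 'allow', 'disallow', 'sitemap', 'crawl-delay', 'host',
]);

function validateRobots(content) {
  const errors = [];
  content.split(/\r?\n/).forEach((line, i) => {
    const trimmed = line.trim();
    if (trimmed === '' || trimmed.startsWith('#')) return; // skip blanks/comments
    const colon = trimmed.indexOf(':');
    if (colon === -1) {
      errors.push(`Line ${i + 1}: missing ':' separator`);
      return;
    }
    const directive = trimmed.slice(0, colon).trim().toLowerCase();
    if (!KNOWN_DIRECTIVES.has(directive)) {
      errors.push(`Line ${i + 1}: unknown directive "${directive}"`);
    }
  });
  return errors;
}

// Example: the third line uses an unrecognized directive and is flagged.
console.log(validateRobots('User-agent: *\nDisallow: /admin\nFoo: bar'));
```

A real validator would go further (checking path syntax, sitemap URLs, and group structure), but the directive check above catches the most common typos.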
# Tags