Robots.txt Generator

🕹 Robots.txt Creation and Validation

# Robots.txt creation

```shell
npx seo-master create robots --allowed /home,/about --disallowed /admin,/security --sitemap https://www.nayanui.com/sitemap.xml --output ./robots.txt
```

You can also use the shorter version of this command:

```shell
npx seo-master create robots -a /home,/about -d /admin,/security -s https://www.nayanui.com/sitemap.xml -o ./robots.txt
```
# Attributes

| Name | Attribute | Type | Default | Details |
| --- | --- | --- | --- | --- |
| Allowed paths | `--allowed` / `-a` | string | | Allowed paths for the robots.txt. |
| Disallowed paths | `--disallowed` / `-d` | string | | Disallowed paths for the robots.txt. |
| Sitemap | `--sitemap` / `-s` | string | | Sitemap URL for the robots.txt. |
| Output | `--output` / `-o` | string | `./robots.txt` | Output path for the robots.txt. |
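For reference, a command with the flags shown above would be expected to produce a file along these lines. This is a sketch of the conventional robots.txt format, not a guaranteed byte-for-byte copy of the tool's output:

```txt
User-agent: *
Allow: /home
Allow: /about
Disallow: /admin
Disallow: /security
Sitemap: https://www.nayanui.com/sitemap.xml
```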
# Robots.txt validation

Validate your robots.txt either locally or via a URL.

```shell
npx seo-master validate robots --input ./robots.txt
```

You can also use the shorter version of this command:

```shell
npx seo-master validate robots -i ./robots.txt
```

You can also validate the robots.txt of your live website by passing its URL:

```shell
npx seo-master validate robots --input https://www.nayanui.com/robots.txt --isremote true
```
# Attributes

| Name | Attribute | Type | Default | Details |
| --- | --- | --- | --- | --- |
| Input Robots.txt | `--input` / `-i` | string | `./robots.txt` | Input path for the robots.txt. |
| Is Remote | `--isremote` / `-ir` | boolean | `false` | Pass `true` if the robots.txt is hosted at a remote URL. |
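If you want to sanity-check a generated robots.txt yourself, Python's standard library ships a parser that can confirm the rules behave as intended. This is a minimal sketch using `urllib.robotparser`, not how `seo-master` itself performs validation:

```python
# Minimal robots.txt sanity check using Python's standard-library parser.
# This illustrates what a validator verifies; it is not seo-master's logic.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Allow: /home
Allow: /about
Disallow: /admin
Disallow: /security
Sitemap: https://www.nayanui.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("*", "/home"))   # True: explicitly allowed
print(rp.can_fetch("*", "/admin"))  # False: disallowed
print(rp.site_maps())               # the Sitemap URLs declared in the file
```

For a live site, you would instead call `rp.set_url("https://www.nayanui.com/robots.txt")` followed by `rp.read()`, which mirrors what the `--isremote true` flow needs to do: fetch the file over HTTP before parsing it.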