# Block certain bots entirely
User-agent: Baiduspider
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Ezooms
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: YandexBot
Disallow: /

# All other crawlers: reduce server load and keep private files out
# (Crawl-delay is non-standard and ignored by some crawlers, e.g. Googlebot)
User-agent: *
Crawl-delay: 30
Disallow: /cgi-bin/
Disallow: /private/
Disallow: /tmp/
Disallow: /wp-includes