I've noticed that my robots.txt file, which I've never edited, contains Disallow: /modules/ under User-agent: *.
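As best I can tell, the relevant lines in the file look something like this (quoting from memory, so the exact layout may differ slightly):

```
User-agent: *
Disallow: /modules/
```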
To me this seems like it would keep bots out of the Forums folder, since it's a subfolder of modules. But I'm fairly sure my forums are crawled quite regularly. Am I mistaken about how it works? Does LEO have anything to do with this?
If I wanted to keep robots out of my forums for a while, would I use Disallow: /modules/Forums, or just /Forums/, or something else?
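In other words, I imagine the rule I'd add would look something like one of these two options (the exact paths are just my guesses based on where the module lives versus what LEO puts in the address bar):

```
User-agent: *
# option 1: block by the actual folder on disk
Disallow: /modules/Forums/
# option 2: block by the rewritten URL path
Disallow: /Forums/
```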
How about my integrated G2 gallery? It's installed in public_html/gallery2, while the integrating module is installed in /modules/gallery/, but with LEO the URLs are like: